| id | lastModified | tags | author | description | citation | cardData | likes | downloads | card |
|---|---|---|---|---|---|---|---|---|---|
bongo2112/moodewji-v2-SDxl-output-images | 2023-09-14T09:11:56.000Z | [
"region:us"
] | bongo2112 | null | null | null | 0 | 0 | Entry not found |
champkris/nadedataset | 2023-09-14T07:53:08.000Z | [
"region:us"
] | champkris | null | null | null | 0 | 0 | Entry not found |
tanmay2798/trainontext | 2023-09-14T07:53:53.000Z | [
"size_categories:n<1K",
"language:en",
"license:unknown",
"region:us"
] | tanmay2798 | null | null | null | 0 | 0 | ---
license: unknown
language:
- en
size_categories:
- n<1K
--- |
open-llm-leaderboard/details_vihangd__smartyplats-3b-v2 | 2023-09-14T07:54:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of vihangd/smartyplats-3b-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [vihangd/smartyplats-3b-v2](https://huggingface.co/vihangd/smartyplats-3b-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vihangd__smartyplats-3b-v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-14T07:53:11.714726](https://huggingface.co/datasets/open-llm-leaderboard/details_vihangd__smartyplats-3b-v2/blob/main/results_2023-09-14T07-53-11.714726.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25015806921375316,\n\
\ \"acc_stderr\": 0.031235552038968953,\n \"acc_norm\": 0.25399784255330154,\n\
\ \"acc_norm_stderr\": 0.03123265058482885,\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023493,\n \"mc2\": 0.36661381284093036,\n\
\ \"mc2_stderr\": 0.01373944353058763\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.36689419795221845,\n \"acc_stderr\": 0.014084133118104289,\n\
\ \"acc_norm\": 0.4104095563139932,\n \"acc_norm_stderr\": 0.014374922192642662\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5288787094204341,\n\
\ \"acc_stderr\": 0.004981451704451047,\n \"acc_norm\": 0.7119099780920135,\n\
\ \"acc_norm_stderr\": 0.004519476835646771\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036843,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036843\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.1925925925925926,\n\
\ \"acc_stderr\": 0.03406542058502653,\n \"acc_norm\": 0.1925925925925926,\n\
\ \"acc_norm_stderr\": 0.03406542058502653\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21710526315789475,\n \"acc_stderr\": 0.03355045304882922,\n\
\ \"acc_norm\": 0.21710526315789475,\n \"acc_norm_stderr\": 0.03355045304882922\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23018867924528302,\n \"acc_stderr\": 0.02590789712240817,\n\
\ \"acc_norm\": 0.23018867924528302,\n \"acc_norm_stderr\": 0.02590789712240817\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.19653179190751446,\n\
\ \"acc_stderr\": 0.030299574664788137,\n \"acc_norm\": 0.19653179190751446,\n\
\ \"acc_norm_stderr\": 0.030299574664788137\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3021276595744681,\n \"acc_stderr\": 0.030017554471880557,\n\
\ \"acc_norm\": 0.3021276595744681,\n \"acc_norm_stderr\": 0.030017554471880557\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.038924311065187525,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.038924311065187525\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20634920634920634,\n \"acc_stderr\": 0.020842290930114676,\n \"\
acc_norm\": 0.20634920634920634,\n \"acc_norm_stderr\": 0.020842290930114676\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471276,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471276\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.17096774193548386,\n\
\ \"acc_stderr\": 0.021417242936321575,\n \"acc_norm\": 0.17096774193548386,\n\
\ \"acc_norm_stderr\": 0.021417242936321575\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.1724137931034483,\n \"acc_stderr\": 0.026577672183036583,\n\
\ \"acc_norm\": 0.1724137931034483,\n \"acc_norm_stderr\": 0.026577672183036583\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.15151515151515152,\n \"acc_stderr\": 0.025545650426603592,\n \"\
acc_norm\": 0.15151515151515152,\n \"acc_norm_stderr\": 0.025545650426603592\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19170984455958548,\n \"acc_stderr\": 0.028408953626245296,\n\
\ \"acc_norm\": 0.19170984455958548,\n \"acc_norm_stderr\": 0.028408953626245296\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2230769230769231,\n \"acc_stderr\": 0.021107730127243998,\n\
\ \"acc_norm\": 0.2230769230769231,\n \"acc_norm_stderr\": 0.021107730127243998\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22962962962962963,\n \"acc_stderr\": 0.02564410863926763,\n \
\ \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.02564410863926763\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.02720537153827946,\n \
\ \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.02720537153827946\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008937,\n \"\
acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008937\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22568807339449543,\n \"acc_stderr\": 0.017923087667803046,\n \"\
acc_norm\": 0.22568807339449543,\n \"acc_norm_stderr\": 0.017923087667803046\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18981481481481483,\n \"acc_stderr\": 0.026744714834691916,\n \"\
acc_norm\": 0.18981481481481483,\n \"acc_norm_stderr\": 0.026744714834691916\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.33183856502242154,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.33183856502242154,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.04010358942462203,\n\
\ \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.04010358942462203\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2066115702479339,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.2066115702479339,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1262135922330097,\n \"acc_stderr\": 0.03288180278808628,\n\
\ \"acc_norm\": 0.1262135922330097,\n \"acc_norm_stderr\": 0.03288180278808628\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.32051282051282054,\n\
\ \"acc_stderr\": 0.030572811310299607,\n \"acc_norm\": 0.32051282051282054,\n\
\ \"acc_norm_stderr\": 0.030572811310299607\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24521072796934865,\n\
\ \"acc_stderr\": 0.01538435228454394,\n \"acc_norm\": 0.24521072796934865,\n\
\ \"acc_norm_stderr\": 0.01538435228454394\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.02355083135199509,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.02355083135199509\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26688102893890675,\n\
\ \"acc_stderr\": 0.02512263760881665,\n \"acc_norm\": 0.26688102893890675,\n\
\ \"acc_norm_stderr\": 0.02512263760881665\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.02447722285613511,\n\
\ \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.02447722285613511\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.025892151156709405,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.025892151156709405\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.22816166883963493,\n\
\ \"acc_stderr\": 0.010717992192047889,\n \"acc_norm\": 0.22816166883963493,\n\
\ \"acc_norm_stderr\": 0.010717992192047889\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.19117647058823528,\n \"acc_stderr\": 0.02388688192244036,\n\
\ \"acc_norm\": 0.19117647058823528,\n \"acc_norm_stderr\": 0.02388688192244036\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25163398692810457,\n \"acc_stderr\": 0.017555818091322267,\n \
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.017555818091322267\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.2727272727272727,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19183673469387755,\n \"acc_stderr\": 0.025206963154225406,\n\
\ \"acc_norm\": 0.19183673469387755,\n \"acc_norm_stderr\": 0.025206963154225406\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.035650796707083106,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.035650796707083106\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023493,\n \"mc2\": 0.36661381284093036,\n\
\ \"mc2_stderr\": 0.01373944353058763\n }\n}\n```"
repo_url: https://huggingface.co/vihangd/smartyplats-3b-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|arc:challenge|25_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hellaswag|10_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T07-53-11.714726.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T07-53-11.714726.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T07-53-11.714726.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T07-53-11.714726.parquet'
- config_name: results
data_files:
- split: 2023_09_14T07_53_11.714726
path:
- results_2023-09-14T07-53-11.714726.parquet
- split: latest
path:
- results_2023-09-14T07-53-11.714726.parquet
---
# Dataset Card for Evaluation run of vihangd/smartyplats-3b-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/vihangd/smartyplats-3b-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [vihangd/smartyplats-3b-v2](https://huggingface.co/vihangd/smartyplats-3b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
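# "harness_truthfulqa_mc_0" is one of the 61 per-task configs listed in the front matter above.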
data = load_dataset("open-llm-leaderboard/details_vihangd__smartyplats-3b-v2",
"harness_truthfulqa_mc_0",
split="train")
```
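The aggregated metrics live in the `results` configuration (see the configs list in the front matter above), whose `latest` split always resolves to the newest run. A minimal sketch to load them:
```python
from datasets import load_dataset

# The "results" config exposes the aggregated run metrics; "latest" points to the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_vihangd__smartyplats-3b-v2",
    "results",
    split="latest",
)
```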
## Latest results
These are the [latest results from run 2023-09-14T07:53:11.714726](https://huggingface.co/datasets/open-llm-leaderboard/details_vihangd__smartyplats-3b-v2/blob/main/results_2023-09-14T07-53-11.714726.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.25015806921375316,
"acc_stderr": 0.031235552038968953,
"acc_norm": 0.25399784255330154,
"acc_norm_stderr": 0.03123265058482885,
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023493,
"mc2": 0.36661381284093036,
"mc2_stderr": 0.01373944353058763
},
"harness|arc:challenge|25": {
"acc": 0.36689419795221845,
"acc_stderr": 0.014084133118104289,
"acc_norm": 0.4104095563139932,
"acc_norm_stderr": 0.014374922192642662
},
"harness|hellaswag|10": {
"acc": 0.5288787094204341,
"acc_stderr": 0.004981451704451047,
"acc_norm": 0.7119099780920135,
"acc_norm_stderr": 0.004519476835646771
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036843,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036843
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.1925925925925926,
"acc_stderr": 0.03406542058502653,
"acc_norm": 0.1925925925925926,
"acc_norm_stderr": 0.03406542058502653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21710526315789475,
"acc_stderr": 0.03355045304882922,
"acc_norm": 0.21710526315789475,
"acc_norm_stderr": 0.03355045304882922
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23018867924528302,
"acc_stderr": 0.02590789712240817,
"acc_norm": 0.23018867924528302,
"acc_norm_stderr": 0.02590789712240817
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.19653179190751446,
"acc_stderr": 0.030299574664788137,
"acc_norm": 0.19653179190751446,
"acc_norm_stderr": 0.030299574664788137
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3021276595744681,
"acc_stderr": 0.030017554471880557,
"acc_norm": 0.3021276595744681,
"acc_norm_stderr": 0.030017554471880557
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.038924311065187525,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.038924311065187525
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.020842290930114676,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.020842290930114676
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471276,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471276
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.17096774193548386,
"acc_stderr": 0.021417242936321575,
"acc_norm": 0.17096774193548386,
"acc_norm_stderr": 0.021417242936321575
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1724137931034483,
"acc_stderr": 0.026577672183036583,
"acc_norm": 0.1724137931034483,
"acc_norm_stderr": 0.026577672183036583
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.15151515151515152,
"acc_stderr": 0.025545650426603592,
"acc_norm": 0.15151515151515152,
"acc_norm_stderr": 0.025545650426603592
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19170984455958548,
"acc_stderr": 0.028408953626245296,
"acc_norm": 0.19170984455958548,
"acc_norm_stderr": 0.028408953626245296
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2230769230769231,
"acc_stderr": 0.021107730127243998,
"acc_norm": 0.2230769230769231,
"acc_norm_stderr": 0.021107730127243998
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.02564410863926763,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.02564410863926763
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.02720537153827946,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.02720537153827946
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008937,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008937
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22568807339449543,
"acc_stderr": 0.017923087667803046,
"acc_norm": 0.22568807339449543,
"acc_norm_stderr": 0.017923087667803046
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18981481481481483,
"acc_stderr": 0.026744714834691916,
"acc_norm": 0.18981481481481483,
"acc_norm_stderr": 0.026744714834691916
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.33183856502242154,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.33183856502242154,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2066115702479339,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.2066115702479339,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.1262135922330097,
"acc_stderr": 0.03288180278808628,
"acc_norm": 0.1262135922330097,
"acc_norm_stderr": 0.03288180278808628
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.32051282051282054,
"acc_stderr": 0.030572811310299607,
"acc_norm": 0.32051282051282054,
"acc_norm_stderr": 0.030572811310299607
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24521072796934865,
"acc_stderr": 0.01538435228454394,
"acc_norm": 0.24521072796934865,
"acc_norm_stderr": 0.01538435228454394
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.02355083135199509,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.02355083135199509
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26688102893890675,
"acc_stderr": 0.02512263760881665,
"acc_norm": 0.26688102893890675,
"acc_norm_stderr": 0.02512263760881665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2623456790123457,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.2623456790123457,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.025892151156709405,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.025892151156709405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.22816166883963493,
"acc_stderr": 0.010717992192047889,
"acc_norm": 0.22816166883963493,
"acc_norm_stderr": 0.010717992192047889
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19117647058823528,
"acc_stderr": 0.02388688192244036,
"acc_norm": 0.19117647058823528,
"acc_norm_stderr": 0.02388688192244036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.017555818091322267,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.017555818091322267
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19183673469387755,
"acc_stderr": 0.025206963154225406,
"acc_norm": 0.19183673469387755,
"acc_norm_stderr": 0.025206963154225406
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.035650796707083106,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.035650796707083106
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023493,
"mc2": 0.36661381284093036,
"mc2_stderr": 0.01373944353058763
}
}
```
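The per-task entries above are plain JSON, so they can be inspected without extra tooling. Below is a minimal sketch that ranks the MMLU (`hendrycksTest`) subtasks by normalized accuracy; `results` is a hypothetical variable assumed to hold the dictionary printed above (e.g. after `json.loads`):
```python
# Assumption: `results` holds the dict printed above (e.g. parsed with json.loads).
# Requires Python 3.9+ for str.removeprefix / str.removesuffix.
mmlu = {
    key.removeprefix("harness|hendrycksTest-").removesuffix("|5"): entry["acc_norm"]
    for key, entry in results.items()
    if key.startswith("harness|hendrycksTest-")
}

# Rank the subtasks from strongest to weakest acc_norm.
for task, score in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{task:<45} acc_norm={score:.4f}")
```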
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_HyperbeeAI__Tulpar-7b-v1 | 2023-09-14T08:00:56.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of HyperbeeAI/Tulpar-7b-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [HyperbeeAI/Tulpar-7b-v1](https://huggingface.co/HyperbeeAI/Tulpar-7b-v1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HyperbeeAI__Tulpar-7b-v1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-14T07:59:39.326009](https://huggingface.co/datasets/open-llm-leaderboard/details_HyperbeeAI__Tulpar-7b-v1/blob/main/results_2023-09-14T07-59-39.326009.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5154814101902333,\n\
\ \"acc_stderr\": 0.03490831850058395,\n \"acc_norm\": 0.5190980044936959,\n\
\ \"acc_norm_stderr\": 0.034892365378161205,\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.0167113581635444,\n \"mc2\": 0.5182558368766056,\n\
\ \"mc2_stderr\": 0.01571195010605344\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5418088737201365,\n \"acc_stderr\": 0.014560220308714698,\n\
\ \"acc_norm\": 0.5699658703071673,\n \"acc_norm_stderr\": 0.014467631559137988\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6116311491734714,\n\
\ \"acc_stderr\": 0.0048638313648480735,\n \"acc_norm\": 0.7968532164907389,\n\
\ \"acc_norm_stderr\": 0.004015185891482733\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03015113445777628,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03015113445777628\n },\n\
\ \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04155319955593146,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04155319955593146\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.4797687861271676,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179327,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179327\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.023266512213730564,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.023266512213730564\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604675,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604675\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.567741935483871,\n\
\ \"acc_stderr\": 0.028181739720019413,\n \"acc_norm\": 0.567741935483871,\n\
\ \"acc_norm_stderr\": 0.028181739720019413\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.0338640574606209,\n\
\ \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.0338640574606209\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.035243908445117815,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.035243908445117815\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6717171717171717,\n \"acc_stderr\": 0.03345678422756775,\n \"\
acc_norm\": 0.6717171717171717,\n \"acc_norm_stderr\": 0.03345678422756775\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7357512953367875,\n \"acc_stderr\": 0.03182155050916646,\n\
\ \"acc_norm\": 0.7357512953367875,\n \"acc_norm_stderr\": 0.03182155050916646\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.47435897435897434,\n \"acc_stderr\": 0.025317649726448656,\n\
\ \"acc_norm\": 0.47435897435897434,\n \"acc_norm_stderr\": 0.025317649726448656\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275815,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275815\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5084033613445378,\n \"acc_stderr\": 0.0324739027656967,\n \
\ \"acc_norm\": 0.5084033613445378,\n \"acc_norm_stderr\": 0.0324739027656967\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.726605504587156,\n \"acc_stderr\": 0.01910929984609829,\n \"acc_norm\"\
: 0.726605504587156,\n \"acc_norm_stderr\": 0.01910929984609829\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.032962451101722294,\n\
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.032962451101722294\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5919282511210763,\n\
\ \"acc_stderr\": 0.03298574607842822,\n \"acc_norm\": 0.5919282511210763,\n\
\ \"acc_norm_stderr\": 0.03298574607842822\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212094,\n \"\
acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n\
\ \"acc_stderr\": 0.04616631111801713,\n \"acc_norm\": 0.6481481481481481,\n\
\ \"acc_norm_stderr\": 0.04616631111801713\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.558282208588957,\n \"acc_stderr\": 0.03901591825836184,\n\
\ \"acc_norm\": 0.558282208588957,\n \"acc_norm_stderr\": 0.03901591825836184\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.045821241601615506,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.045821241601615506\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.782051282051282,\n\
\ \"acc_stderr\": 0.027046857630716684,\n \"acc_norm\": 0.782051282051282,\n\
\ \"acc_norm_stderr\": 0.027046857630716684\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7088122605363985,\n\
\ \"acc_stderr\": 0.016246087069701407,\n \"acc_norm\": 0.7088122605363985,\n\
\ \"acc_norm_stderr\": 0.016246087069701407\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5867052023121387,\n \"acc_stderr\": 0.02651126136940924,\n\
\ \"acc_norm\": 0.5867052023121387,\n \"acc_norm_stderr\": 0.02651126136940924\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2558659217877095,\n\
\ \"acc_stderr\": 0.014593620923210754,\n \"acc_norm\": 0.2558659217877095,\n\
\ \"acc_norm_stderr\": 0.014593620923210754\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.02861462475280544,\n\
\ \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.02861462475280544\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n\
\ \"acc_stderr\": 0.027882383791325953,\n \"acc_norm\": 0.594855305466238,\n\
\ \"acc_norm_stderr\": 0.027882383791325953\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5617283950617284,\n \"acc_stderr\": 0.027607914087400473,\n\
\ \"acc_norm\": 0.5617283950617284,\n \"acc_norm_stderr\": 0.027607914087400473\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614105,\n \
\ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614105\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39113428943937417,\n\
\ \"acc_stderr\": 0.012463861839982061,\n \"acc_norm\": 0.39113428943937417,\n\
\ \"acc_norm_stderr\": 0.012463861839982061\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.49673202614379086,\n \"acc_stderr\": 0.020227402794434864,\n \
\ \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.020227402794434864\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.031680911612338825,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.031680911612338825\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n\
\ \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.6716417910447762,\n\
\ \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.0167113581635444,\n \"mc2\": 0.5182558368766056,\n\
\ \"mc2_stderr\": 0.01571195010605344\n }\n}\n```"
repo_url: https://huggingface.co/HyperbeeAI/Tulpar-7b-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|arc:challenge|25_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hellaswag|10_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T07-59-39.326009.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T07-59-39.326009.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T07-59-39.326009.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T07-59-39.326009.parquet'
- config_name: results
data_files:
- split: 2023_09_14T07_59_39.326009
path:
- results_2023-09-14T07-59-39.326009.parquet
- split: latest
path:
- results_2023-09-14T07-59-39.326009.parquet
---
# Dataset Card for Evaluation run of HyperbeeAI/Tulpar-7b-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/HyperbeeAI/Tulpar-7b-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [HyperbeeAI/Tulpar-7b-v1](https://huggingface.co/HyperbeeAI/Tulpar-7b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HyperbeeAI__Tulpar-7b-v1",
"harness_truthfulqa_mc_0",
split="train")
```
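Each configuration also exposes the timestamped split listed in the YAML configs above alongside a rolling `latest` split, so a specific run can be pinned explicitly. A small sketch (config and split names are taken from this card; `get_dataset_config_names` is part of the `datasets` library):
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_HyperbeeAI__Tulpar-7b-v1"

# Enumerate every configuration in this repo (per-task details plus "results").
print(get_dataset_config_names(repo))

# Pin the exact run via its timestamped split instead of the rolling "latest".
details = load_dataset(repo, "harness_truthfulqa_mc_0", split="2023_09_14T07_59_39.326009")
print(details)
```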
## Latest results
These are the [latest results from run 2023-09-14T07:59:39.326009](https://huggingface.co/datasets/open-llm-leaderboard/details_HyperbeeAI__Tulpar-7b-v1/blob/main/results_2023-09-14T07-59-39.326009.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5154814101902333,
"acc_stderr": 0.03490831850058395,
"acc_norm": 0.5190980044936959,
"acc_norm_stderr": 0.034892365378161205,
"mc1": 0.35128518971848227,
"mc1_stderr": 0.0167113581635444,
"mc2": 0.5182558368766056,
"mc2_stderr": 0.01571195010605344
},
"harness|arc:challenge|25": {
"acc": 0.5418088737201365,
"acc_stderr": 0.014560220308714698,
"acc_norm": 0.5699658703071673,
"acc_norm_stderr": 0.014467631559137988
},
"harness|hellaswag|10": {
"acc": 0.6116311491734714,
"acc_stderr": 0.0048638313648480735,
"acc_norm": 0.7968532164907389,
"acc_norm_stderr": 0.004015185891482733
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.03015113445777628,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03015113445777628
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04155319955593146,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04155319955593146
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179327,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179327
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.023266512213730564,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.023266512213730564
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604675,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604675
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.567741935483871,
"acc_stderr": 0.028181739720019413,
"acc_norm": 0.567741935483871,
"acc_norm_stderr": 0.028181739720019413
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.035243908445117815,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.035243908445117815
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6717171717171717,
"acc_stderr": 0.03345678422756775,
"acc_norm": 0.6717171717171717,
"acc_norm_stderr": 0.03345678422756775
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7357512953367875,
"acc_stderr": 0.03182155050916646,
"acc_norm": 0.7357512953367875,
"acc_norm_stderr": 0.03182155050916646
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.47435897435897434,
"acc_stderr": 0.025317649726448656,
"acc_norm": 0.47435897435897434,
"acc_norm_stderr": 0.025317649726448656
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275815,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275815
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5084033613445378,
"acc_stderr": 0.0324739027656967,
"acc_norm": 0.5084033613445378,
"acc_norm_stderr": 0.0324739027656967
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.726605504587156,
"acc_stderr": 0.01910929984609829,
"acc_norm": 0.726605504587156,
"acc_norm_stderr": 0.01910929984609829
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.032962451101722294,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.032962451101722294
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5919282511210763,
"acc_stderr": 0.03298574607842822,
"acc_norm": 0.5919282511210763,
"acc_norm_stderr": 0.03298574607842822
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.04616631111801713,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.04616631111801713
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.558282208588957,
"acc_stderr": 0.03901591825836184,
"acc_norm": 0.558282208588957,
"acc_norm_stderr": 0.03901591825836184
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.045821241601615506,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.045821241601615506
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.782051282051282,
"acc_stderr": 0.027046857630716684,
"acc_norm": 0.782051282051282,
"acc_norm_stderr": 0.027046857630716684
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7088122605363985,
"acc_stderr": 0.016246087069701407,
"acc_norm": 0.7088122605363985,
"acc_norm_stderr": 0.016246087069701407
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5867052023121387,
"acc_stderr": 0.02651126136940924,
"acc_norm": 0.5867052023121387,
"acc_norm_stderr": 0.02651126136940924
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2558659217877095,
"acc_stderr": 0.014593620923210754,
"acc_norm": 0.2558659217877095,
"acc_norm_stderr": 0.014593620923210754
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.02861462475280544,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.02861462475280544
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.594855305466238,
"acc_stderr": 0.027882383791325953,
"acc_norm": 0.594855305466238,
"acc_norm_stderr": 0.027882383791325953
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5617283950617284,
"acc_stderr": 0.027607914087400473,
"acc_norm": 0.5617283950617284,
"acc_norm_stderr": 0.027607914087400473
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.028947338851614105,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.028947338851614105
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39113428943937417,
"acc_stderr": 0.012463861839982061,
"acc_norm": 0.39113428943937417,
"acc_norm_stderr": 0.012463861839982061
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.020227402794434864,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.020227402794434864
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.031680911612338825,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.031680911612338825
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6716417910447762,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.6716417910447762,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.0352821125824523,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.0352821125824523
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35128518971848227,
"mc1_stderr": 0.0167113581635444,
"mc2": 0.5182558368766056,
"mc2_stderr": 0.01571195010605344
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
marasama/nva-Leucochloridium | 2023-09-14T10:47:40.000Z | [
"region:us"
] | marasama | null | null | null | 0 | 0 | Entry not found |
CyberHarem/morikubo_nono_idolmastercinderellagirls | 2023-09-17T17:36:54.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of morikubo_nono (THE iDOLM@STER: Cinderella Girls)
This is the dataset of morikubo_nono (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 504 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 504 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 504 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 504 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
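For reference, here is a minimal sketch of fetching and unpacking one of the packs listed above with `huggingface_hub` (the repo id and the `dataset-512x512.zip` filename come from this card; the `morikubo_nono` extraction directory is only an illustrative choice):
```python
from huggingface_hub import hf_hub_download
import zipfile

# Download one aligned pack from this dataset repository.
pack = hf_hub_download(
    repo_id="CyberHarem/morikubo_nono_idolmastercinderellagirls",
    filename="dataset-512x512.zip",
    repo_type="dataset",
)

# Extract the images and their tag files locally.
with zipfile.ZipFile(pack) as zf:
    zf.extractall("morikubo_nono")
```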
|
open-llm-leaderboard/details_lu-vae__llama2-13B-sharegpt4-orca-openplatypus-8w | 2023-09-14T08:20:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of lu-vae/llama2-13B-sharegpt4-orca-openplatypus-8w
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lu-vae/llama2-13B-sharegpt4-orca-openplatypus-8w](https://huggingface.co/lu-vae/llama2-13B-sharegpt4-orca-openplatypus-8w)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lu-vae__llama2-13B-sharegpt4-orca-openplatypus-8w\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-14T08:19:15.271974](https://huggingface.co/datasets/open-llm-leaderboard/details_lu-vae__llama2-13B-sharegpt4-orca-openplatypus-8w/blob/main/results_2023-09-14T08-19-15.271974.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5534484375858194,\n\
\ \"acc_stderr\": 0.034349475920530226,\n \"acc_norm\": 0.5575469466452477,\n\
\ \"acc_norm_stderr\": 0.0343257434981978,\n \"mc1\": 0.32802937576499386,\n\
\ \"mc1_stderr\": 0.016435632932815025,\n \"mc2\": 0.4565749047878431,\n\
\ \"mc2_stderr\": 0.015269312980089841\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n\
\ \"acc_norm\": 0.6279863481228669,\n \"acc_norm_stderr\": 0.014124597881844461\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6395140410276837,\n\
\ \"acc_stderr\": 0.004791601975612765,\n \"acc_norm\": 0.8403704441346346,\n\
\ \"acc_norm_stderr\": 0.003655136111553706\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.04033565667848319,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.04033565667848319\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.030197611600197943,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.030197611600197943\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04016660030451233\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.0416656757710158,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.0416656757710158\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3492063492063492,\n \"acc_stderr\": 0.024552292209342654,\n \"\
acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.024552292209342654\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n\
\ \"acc_stderr\": 0.024580028921480996,\n \"acc_norm\": 0.24838709677419354,\n\
\ \"acc_norm_stderr\": 0.024580028921480996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.034867317274198714,\n\
\ \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.034867317274198714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6111111111111112,\n \"acc_stderr\": 0.0347327959083696,\n \"acc_norm\"\
: 0.6111111111111112,\n \"acc_norm_stderr\": 0.0347327959083696\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.03003114797764154,\n\
\ \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.03003114797764154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.02493931390694079,\n \
\ \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.02493931390694079\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547307,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547307\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \
\ \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.037579499229433426,\n \"\
acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.037579499229433426\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7834862385321101,\n \"acc_stderr\": 0.01765871059444313,\n \"\
acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.01765871059444313\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.033723432716530645,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.033723432716530645\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n\
\ \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.041391127276354626,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.041391127276354626\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.03731133519673893,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.03731133519673893\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.045218299028335865,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.045218299028335865\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.027236013946196697,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.027236013946196697\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7611749680715197,\n\
\ \"acc_stderr\": 0.015246803197398675,\n \"acc_norm\": 0.7611749680715197,\n\
\ \"acc_norm_stderr\": 0.015246803197398675\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.638728323699422,\n \"acc_stderr\": 0.025862201852277878,\n\
\ \"acc_norm\": 0.638728323699422,\n \"acc_norm_stderr\": 0.025862201852277878\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36312849162011174,\n\
\ \"acc_stderr\": 0.016083749986853704,\n \"acc_norm\": 0.36312849162011174,\n\
\ \"acc_norm_stderr\": 0.016083749986853704\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5718954248366013,\n \"acc_stderr\": 0.028332397483664274,\n\
\ \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.028332397483664274\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.02731684767419271,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.02731684767419271\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.025773111169630446,\n\
\ \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.025773111169630446\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236844,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236844\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4745762711864407,\n\
\ \"acc_stderr\": 0.012753716929101006,\n \"acc_norm\": 0.4745762711864407,\n\
\ \"acc_norm_stderr\": 0.012753716929101006\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5882352941176471,\n \"acc_stderr\": 0.019910377463105935,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.019910377463105935\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5551020408163265,\n \"acc_stderr\": 0.03181425118197786,\n\
\ \"acc_norm\": 0.5551020408163265,\n \"acc_norm_stderr\": 0.03181425118197786\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.15920398009950248,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.15920398009950248,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32802937576499386,\n\
\ \"mc1_stderr\": 0.016435632932815025,\n \"mc2\": 0.4565749047878431,\n\
\ \"mc2_stderr\": 0.015269312980089841\n }\n}\n```"
repo_url: https://huggingface.co/lu-vae/llama2-13B-sharegpt4-orca-openplatypus-8w
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|arc:challenge|25_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hellaswag|10_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T08-19-15.271974.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T08-19-15.271974.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T08-19-15.271974.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T08-19-15.271974.parquet'
- config_name: results
data_files:
- split: 2023_09_14T08_19_15.271974
path:
- results_2023-09-14T08-19-15.271974.parquet
- split: latest
path:
- results_2023-09-14T08-19-15.271974.parquet
---
# Dataset Card for Evaluation run of lu-vae/llama2-13B-sharegpt4-orca-openplatypus-8w
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lu-vae/llama2-13B-sharegpt4-orca-openplatypus-8w
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lu-vae/llama2-13B-sharegpt4-orca-openplatypus-8w](https://huggingface.co/lu-vae/llama2-13B-sharegpt4-orca-openplatypus-8w) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lu-vae__llama2-13B-sharegpt4-orca-openplatypus-8w",
"harness_truthfulqa_mc_0",
split="train")
```
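The aggregated metrics live in the "results" configuration, and every configuration also exposes a "latest" split (both are listed in the configs above); a minimal sketch of loading them:
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split always
# points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_lu-vae__llama2-13B-sharegpt4-orca-openplatypus-8w",
    "results",
    split="latest",
)
```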
## Latest results
These are the [latest results from run 2023-09-14T08:19:15.271974](https://huggingface.co/datasets/open-llm-leaderboard/details_lu-vae__llama2-13B-sharegpt4-orca-openplatypus-8w/blob/main/results_2023-09-14T08-19-15.271974.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5534484375858194,
"acc_stderr": 0.034349475920530226,
"acc_norm": 0.5575469466452477,
"acc_norm_stderr": 0.0343257434981978,
"mc1": 0.32802937576499386,
"mc1_stderr": 0.016435632932815025,
"mc2": 0.4565749047878431,
"mc2_stderr": 0.015269312980089841
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398326,
"acc_norm": 0.6279863481228669,
"acc_norm_stderr": 0.014124597881844461
},
"harness|hellaswag|10": {
"acc": 0.6395140410276837,
"acc_stderr": 0.004791601975612765,
"acc_norm": 0.8403704441346346,
"acc_norm_stderr": 0.003655136111553706
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.04033565667848319,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.04033565667848319
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.030197611600197943,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.030197611600197943
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04016660030451233,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04016660030451233
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.0416656757710158,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.0416656757710158
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.024552292209342654,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.024552292209342654
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.024580028921480996,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.024580028921480996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.034867317274198714,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.034867317274198714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0347327959083696,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0347327959083696
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7772020725388601,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.7772020725388601,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.02493931390694079,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.02493931390694079
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547307,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547307
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.037579499229433426,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.037579499229433426
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7834862385321101,
"acc_stderr": 0.01765871059444313,
"acc_norm": 0.7834862385321101,
"acc_norm_stderr": 0.01765871059444313
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.033723432716530645,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.033723432716530645
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5419847328244275,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.5419847328244275,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.03731133519673893,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.03731133519673893
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.045218299028335865,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.045218299028335865
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.027236013946196697,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.027236013946196697
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7611749680715197,
"acc_stderr": 0.015246803197398675,
"acc_norm": 0.7611749680715197,
"acc_norm_stderr": 0.015246803197398675
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.638728323699422,
"acc_stderr": 0.025862201852277878,
"acc_norm": 0.638728323699422,
"acc_norm_stderr": 0.025862201852277878
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36312849162011174,
"acc_stderr": 0.016083749986853704,
"acc_norm": 0.36312849162011174,
"acc_norm_stderr": 0.016083749986853704
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5718954248366013,
"acc_stderr": 0.028332397483664274,
"acc_norm": 0.5718954248366013,
"acc_norm_stderr": 0.028332397483664274
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.02731684767419271,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.02731684767419271
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.025773111169630446,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.025773111169630446
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236844,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4745762711864407,
"acc_stderr": 0.012753716929101006,
"acc_norm": 0.4745762711864407,
"acc_norm_stderr": 0.012753716929101006
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.019910377463105935,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.019910377463105935
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505417,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505417
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5551020408163265,
"acc_stderr": 0.03181425118197786,
"acc_norm": 0.5551020408163265,
"acc_norm_stderr": 0.03181425118197786
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.15920398009950248,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.15920398009950248,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32802937576499386,
"mc1_stderr": 0.016435632932815025,
"mc2": 0.4565749047878431,
"mc2_stderr": 0.015269312980089841
}
}
```
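If you want to post-process these numbers, here is a minimal sketch of averaging the MMLU (hendrycksTest) accuracies; the two-entry `results` stub below is just an illustration standing in for the full dictionary printed above:

```python
from statistics import mean

# Illustrative stub: in practice, bind `results` to the full dictionary
# shown above (e.g. after json.load-ing a results_*.json file).
results = {
    "harness|hendrycksTest-college_biology|5": {"acc": 0.6388888888888888},
    "harness|hendrycksTest-college_chemistry|5": {"acc": 0.41},
}

# Average the plain accuracies over all MMLU (hendrycksTest) subtasks.
mmlu_accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"MMLU average acc over {len(mmlu_accs)} subtasks: {mean(mmlu_accs):.4f}")
```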
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
rikdas/fabric_dataset | 2023-09-14T08:37:27.000Z | [
"region:us"
] | rikdas | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 41259319.0
num_examples: 20
download_size: 41261924
dataset_size: 41259319.0
---
# Dataset Card for "fabric_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
YECHAN/processed_bert_dataset | 2023-09-14T08:31:17.000Z | [
"region:us"
] | YECHAN | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 173314800.0
num_examples: 48143
download_size: 41856821
dataset_size: 173314800.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "processed_bert_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Weyaxi__ChatAYT-Lora-Assamble-Marcoroni | 2023-09-14T08:41:07.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Weyaxi/ChatAYT-Lora-Assamble-Marcoroni
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/ChatAYT-Lora-Assamble-Marcoroni](https://huggingface.co/Weyaxi/ChatAYT-Lora-Assamble-Marcoroni)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__ChatAYT-Lora-Assamble-Marcoroni\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-14T08:39:51.722063](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__ChatAYT-Lora-Assamble-Marcoroni/blob/main/results_2023-09-14T08-39-51.722063.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5882764741172468,\n\
\ \"acc_stderr\": 0.03414095987700657,\n \"acc_norm\": 0.591952945933824,\n\
\ \"acc_norm_stderr\": 0.034120879574318176,\n \"mc1\": 0.40269277845777235,\n\
\ \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5612473537179327,\n\
\ \"mc2_stderr\": 0.015702862027941283\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6092150170648464,\n \"acc_stderr\": 0.014258563880513778,\n\
\ \"acc_norm\": 0.6245733788395904,\n \"acc_norm_stderr\": 0.014150631435111728\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6289583748257319,\n\
\ \"acc_stderr\": 0.004820962855749737,\n \"acc_norm\": 0.8305118502290381,\n\
\ \"acc_norm_stderr\": 0.0037441574425365596\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04016660030451233\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502706986,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502706986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332783,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7,\n \"acc_stderr\": 0.02606936229533514,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.02606936229533514\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.035014387062967806,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.035014387062967806\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932026,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932026\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164552,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164552\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5820512820512821,\n \"acc_stderr\": 0.02500732988246122,\n \
\ \"acc_norm\": 0.5820512820512821,\n \"acc_norm_stderr\": 0.02500732988246122\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.032016501007396114,\n\
\ \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.032016501007396114\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.763302752293578,\n \"acc_stderr\": 0.018224078117299102,\n \"\
acc_norm\": 0.763302752293578,\n \"acc_norm_stderr\": 0.018224078117299102\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4027777777777778,\n \"acc_stderr\": 0.033448873829978666,\n \"\
acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.033448873829978666\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.04039314978724561,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.04039314978724561\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912073,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724147,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724147\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.024161618127987745,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.024161618127987745\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\
\ \"acc_stderr\": 0.01498727064094602,\n \"acc_norm\": 0.7726692209450831,\n\
\ \"acc_norm_stderr\": 0.01498727064094602\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.02599247202930639,\n\
\ \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.02599247202930639\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42569832402234636,\n\
\ \"acc_stderr\": 0.01653682964899711,\n \"acc_norm\": 0.42569832402234636,\n\
\ \"acc_norm_stderr\": 0.01653682964899711\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.02753007844711031,\n\
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.02753007844711031\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\
\ \"acc_stderr\": 0.02698147804364804,\n \"acc_norm\": 0.6559485530546624,\n\
\ \"acc_norm_stderr\": 0.02698147804364804\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02622964917882116,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02622964917882116\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.424380704041721,\n\
\ \"acc_stderr\": 0.012623343757430018,\n \"acc_norm\": 0.424380704041721,\n\
\ \"acc_norm_stderr\": 0.012623343757430018\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.03004261583271487,\n\
\ \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03004261583271487\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5522875816993464,\n \"acc_stderr\": 0.020116925347422425,\n \
\ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.020116925347422425\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117827,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117827\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40269277845777235,\n\
\ \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5612473537179327,\n\
\ \"mc2_stderr\": 0.015702862027941283\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/ChatAYT-Lora-Assamble-Marcoroni
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|arc:challenge|25_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hellaswag|10_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T08-39-51.722063.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T08-39-51.722063.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T08-39-51.722063.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T08-39-51.722063.parquet'
- config_name: results
data_files:
- split: 2023_09_14T08_39_51.722063
path:
- results_2023-09-14T08-39-51.722063.parquet
- split: latest
path:
- results_2023-09-14T08-39-51.722063.parquet
---
# Dataset Card for Evaluation run of Weyaxi/ChatAYT-Lora-Assamble-Marcoroni
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/ChatAYT-Lora-Assamble-Marcoroni
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/ChatAYT-Lora-Assamble-Marcoroni](https://huggingface.co/Weyaxi/ChatAYT-Lora-Assamble-Marcoroni) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__ChatAYT-Lora-Assamble-Marcoroni",
"harness_truthfulqa_mc_0",
split="train")
```
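The same pattern works for any of the per-task configs listed in this card's metadata. As a minimal sketch (using the `results` config and `latest` split defined above), you can also load just the aggregated scores:

```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics of a run; the "latest"
# split always points at the most recent evaluation of this model.
results = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__ChatAYT-Lora-Assamble-Marcoroni",
    "results",
    split="latest",
)
print(results[0])
```

Timestamped splits such as `2023_09_14T08_39_51.722063` remain available when you need a specific run rather than the latest one.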
## Latest results
These are the [latest results from run 2023-09-14T08:39:51.722063](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__ChatAYT-Lora-Assamble-Marcoroni/blob/main/results_2023-09-14T08-39-51.722063.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5882764741172468,
"acc_stderr": 0.03414095987700657,
"acc_norm": 0.591952945933824,
"acc_norm_stderr": 0.034120879574318176,
"mc1": 0.40269277845777235,
"mc1_stderr": 0.01716883093518722,
"mc2": 0.5612473537179327,
"mc2_stderr": 0.015702862027941283
},
"harness|arc:challenge|25": {
"acc": 0.6092150170648464,
"acc_stderr": 0.014258563880513778,
"acc_norm": 0.6245733788395904,
"acc_norm_stderr": 0.014150631435111728
},
"harness|hellaswag|10": {
"acc": 0.6289583748257319,
"acc_stderr": 0.004820962855749737,
"acc_norm": 0.8305118502290381,
"acc_norm_stderr": 0.0037441574425365596
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04016660030451233,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04016660030451233
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502706986,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502706986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.024677862841332783,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.024677862841332783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7,
"acc_stderr": 0.02606936229533514,
"acc_norm": 0.7,
"acc_norm_stderr": 0.02606936229533514
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164552,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5820512820512821,
"acc_stderr": 0.02500732988246122,
"acc_norm": 0.5820512820512821,
"acc_norm_stderr": 0.02500732988246122
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.032016501007396114,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.032016501007396114
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.763302752293578,
"acc_stderr": 0.018224078117299102,
"acc_norm": 0.763302752293578,
"acc_norm_stderr": 0.018224078117299102
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.033448873829978666,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.033448873829978666
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808503,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808503
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.04039314978724561,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.04039314978724561
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912073,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912073
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724147,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724147
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.024161618127987745,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.024161618127987745
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.01498727064094602,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.01498727064094602
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.02599247202930639,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.02599247202930639
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42569832402234636,
"acc_stderr": 0.01653682964899711,
"acc_norm": 0.42569832402234636,
"acc_norm_stderr": 0.01653682964899711
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.02753007844711031,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.02753007844711031
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.02698147804364804,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.02698147804364804
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02622964917882116,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02622964917882116
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.424380704041721,
"acc_stderr": 0.012623343757430018,
"acc_norm": 0.424380704041721,
"acc_norm_stderr": 0.012623343757430018
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.03004261583271487,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.03004261583271487
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.020116925347422425,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.020116925347422425
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117827,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117827
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40269277845777235,
"mc1_stderr": 0.01716883093518722,
"mc2": 0.5612473537179327,
"mc2_stderr": 0.015702862027941283
}
}
```
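To work with these numbers programmatically, one option is to fetch the results file linked above directly (a sketch using `huggingface_hub`, assuming the file's top level matches the snippet shown):
```python
import json

from huggingface_hub import hf_hub_download

# Download the results file referenced in the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Weyaxi__ChatAYT-Lora-Assamble-Marcoroni",
    filename="results_2023-09-14T08-39-51.722063.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

print(results["all"]["acc"])  # 0.5882... if the layout matches the snippet above
```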
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
rikdas/fabric_data | 2023-09-14T08:40:34.000Z | [
"region:us"
] | rikdas | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 41259319.0
num_examples: 20
download_size: 41261924
dataset_size: 41259319.0
---
# Dataset Card for "fabric_data"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Curvature/testdata | 2023-09-14T08:46:20.000Z | [
"license:other",
"region:us"
] | Curvature | null | null | null | 0 | 0 | ---
license: other
---
|
pigge/testsplat | 2023-10-02T10:59:02.000Z | [
"region:us"
] | pigge | null | null | null | 0 | 0 | Entry not found |
CyberHarem/shiraki_hime_watashinoyuriwaoshigotodesu | 2023-09-17T17:36:56.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Shiraki Hime
This is the dataset of Shiraki Hime, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 655 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 655 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 655 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 655 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
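Each variant in the table is a zip file in this repository; one way to fetch and unpack a variant (a sketch using `huggingface_hub`, assuming the zips sit at the repository root as the relative links above suggest):
```python
import zipfile

from huggingface_hub import hf_hub_download

# Download the 512x512 aligned variant listed in the table above.
path = hf_hub_download(
    repo_id="CyberHarem/shiraki_hime_watashinoyuriwaoshigotodesu",
    filename="dataset-512x512.zip",
    repo_type="dataset",
)
with zipfile.ZipFile(path) as zf:
    zf.extractall("shiraki_hime_512x512")  # images and their tag files
```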
|
CyberHarem/kurosaki_chitose_idolmastercinderellagirls | 2023-09-17T17:36:58.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kurosaki_chitose (THE iDOLM@STER: Cinderella Girls)
This is the dataset of kurosaki_chitose (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 517 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 517 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 517 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 517 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
nguyenphuthien/vi-alpaca-data | 2023-09-14T09:07:05.000Z | [
"license:mit",
"region:us"
] | nguyenphuthien | null | null | null | 0 | 0 | ---
license: mit
---
|
CyberHarem/yano_mitsuki_watashinoyuriwaoshigotodesu | 2023-09-17T17:37:01.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Yano Mitsuki
This is the dataset of Yano Mitsuki, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 643 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 643 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 643 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 643 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
kplgpt68/ASR_dataset | 2023-09-14T09:30:28.000Z | [
"region:us"
] | kplgpt68 | null | null | null | 0 | 0 | Entry not found |
CyberHarem/mamiya_kanoko_watashinoyuriwaoshigotodesu | 2023-09-17T17:37:03.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Mamiya Kanoko
This is the dataset of Mamiya Kanoko, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 672 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 672 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 672 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 672 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
JAYASWAROOP/mining_laws | 2023-09-14T09:51:05.000Z | [
"region:us"
] | JAYASWAROOP | null | null | null | 0 | 0 | Entry not found |
sabuhi1997/fine-tune-hebrew-dataset | 2023-09-14T10:40:44.000Z | [
"region:us"
] | sabuhi1997 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': test
'1': train
'2': validation
splits:
- name: train
num_bytes: 5714802.0
num_examples: 8
- name: validation
num_bytes: 1759819.0
num_examples: 3
- name: test
num_bytes: 1625529.0
num_examples: 4
download_size: 7719156
dataset_size: 9100150.0
---
# Dataset Card for "fine-tune-hebrew-dataset"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/sajou_yukimi_idolmastercinderellagirls | 2023-09-17T17:37:05.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sajou_yukimi (THE iDOLM@STER: Cinderella Girls)
This is the dataset of sajou_yukimi (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 519 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 519 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 519 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 519 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
lulblaka/midj | 2023-09-14T10:10:18.000Z | [
"region:us"
] | lulblaka | null | null | null | 0 | 0 | Entry not found |
CyberHarem/chibana_sumika_watashinoyuriwaoshigotodesu | 2023-09-17T17:37:07.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Chibana Sumika
This is the dataset of Chibana Sumika, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 681 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 681 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 681 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 681 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
Satooo123/sd-poke | 2023-09-14T10:18:09.000Z | [
"region:us"
] | Satooo123 | null | null | null | 0 | 0 | Entry not found |
CyberHarem/shiina_mashiro_sakurasounopetnakanojo | 2023-09-17T17:37:09.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Shiina Mashiro
This is the dataset of Shiina Mashiro, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 688 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 688 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 688 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 688 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/koshiba_mai_watashinoyuriwaoshigotodesu | 2023-09-17T17:37:11.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Koshiba Mai
This is the dataset of Koshiba Mai, containing 220 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 220 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 507 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 220 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 220 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 220 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 220 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 220 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 507 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 507 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 507 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
mindchain/demo_23 | 2023-09-14T10:35:53.000Z | [
"region:us"
] | mindchain | null | null | null | 0 | 0 | Entry not found |
CortexFoundationOld/openvalidators | 2023-09-14T10:35:14.000Z | [
"region:us"
] | CortexFoundationOld | null | null | null | 0 | 0 | Entry not found |
mindchain/demo_24 | 2023-09-14T10:47:09.000Z | [
"region:us"
] | mindchain | null | null | null | 0 | 0 | Entry not found |
achieverprince/patabima-llama2-dataset | 2023-09-14T10:45:12.000Z | [
"region:us"
] | achieverprince | null | null | null | 0 | 0 | Entry not found |
mzh199163/bnb | 2023-09-14T10:50:15.000Z | [
"region:us"
] | mzh199163 | null | null | null | 0 | 0 | Entry not found |
CyberHarem/hayasaka_mirei_idolmastercinderellagirls | 2023-09-17T17:37:13.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hayasaka_mirei (THE iDOLM@STER: Cinderella Girls)
This is the dataset of hayasaka_mirei (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 540 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 540 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 540 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 540 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/aoyama_nanami_sakurasounopetnakanojo | 2023-09-17T17:37:15.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Aoyama Nanami
This is the dataset of Aoyama Nanami, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 709 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 709 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 709 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 709 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
achieverprince/patabima-llama2-dataset-v2 | 2023-09-14T10:56:20.000Z | [
"region:us"
] | achieverprince | null | null | null | 0 | 0 | Entry not found |
achieverprince/patabima-llama2-dataset-v3 | 2023-09-14T10:58:07.000Z | [
"region:us"
] | achieverprince | null | null | null | 0 | 0 | Entry not found |
achieverprince/patabima-llama2-dataset-v4 | 2023-09-14T11:01:40.000Z | [
"region:us"
] | achieverprince | null | null | null | 0 | 0 | Entry not found |
CyberHarem/kamiigusa_misaki_sakurasounopetnakanojo | 2023-09-17T17:37:17.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Kamiigusa Misaki
This is the dataset of Kamiigusa Misaki, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 725 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 725 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 725 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 725 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble | 2023-09-14T11:42:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of oh-yeontaek/llama-2-70B-LoRA-assemble
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [oh-yeontaek/llama-2-70B-LoRA-assemble](https://huggingface.co/oh-yeontaek/llama-2-70B-LoRA-assemble)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-14T11:41:03.022396](https://huggingface.co/datasets/open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble/blob/main/results_2023-09-14T11-41-03.022396.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6934330265245879,\n\
\ \"acc_stderr\": 0.031312838620430335,\n \"acc_norm\": 0.697335554746802,\n\
\ \"acc_norm_stderr\": 0.03128337547678218,\n \"mc1\": 0.46511627906976744,\n\
\ \"mc1_stderr\": 0.01746084997587397,\n \"mc2\": 0.6479539766332348,\n\
\ \"mc2_stderr\": 0.014916593992436448\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6851535836177475,\n \"acc_stderr\": 0.01357265770308495,\n\
\ \"acc_norm\": 0.7184300341296929,\n \"acc_norm_stderr\": 0.013143376735009022\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6707827126070504,\n\
\ \"acc_stderr\": 0.00468968597815517,\n \"acc_norm\": 0.867755427205736,\n\
\ \"acc_norm_stderr\": 0.0033806414709899157\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.03391160934343603,\n\
\ \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.03391160934343603\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.027134291628741702,\n\
\ \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.027134291628741702\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\
\ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\
\ \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6638297872340425,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.6638297872340425,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.040287315329475576,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.040287315329475576\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4656084656084656,\n \"acc_stderr\": 0.025690321762493844,\n \"\
acc_norm\": 0.4656084656084656,\n \"acc_norm_stderr\": 0.025690321762493844\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8290322580645161,\n\
\ \"acc_stderr\": 0.021417242936321582,\n \"acc_norm\": 0.8290322580645161,\n\
\ \"acc_norm_stderr\": 0.021417242936321582\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\"\
: 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8939393939393939,\n \"acc_stderr\": 0.021938047738853113,\n \"\
acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.021938047738853113\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6948717948717948,\n \"acc_stderr\": 0.023346335293325887,\n\
\ \"acc_norm\": 0.6948717948717948,\n \"acc_norm_stderr\": 0.023346335293325887\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.02720537153827947,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.02720537153827947\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248436,\n \"\
acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248436\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8862385321100917,\n \"acc_stderr\": 0.013613614800232805,\n \"\
acc_norm\": 0.8862385321100917,\n \"acc_norm_stderr\": 0.013613614800232805\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8970588235294118,\n \"acc_stderr\": 0.021328337570804365,\n \"\
acc_norm\": 0.8970588235294118,\n \"acc_norm_stderr\": 0.021328337570804365\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8734177215189873,\n \"acc_stderr\": 0.021644195727955173,\n \
\ \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.021644195727955173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.02799153425851952,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.02799153425851952\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744633,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744633\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.03004735765580663,\n\
\ \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.03004735765580663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8633461047254151,\n\
\ \"acc_stderr\": 0.012282876868629234,\n \"acc_norm\": 0.8633461047254151,\n\
\ \"acc_norm_stderr\": 0.012282876868629234\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7716763005780347,\n \"acc_stderr\": 0.022598703804321635,\n\
\ \"acc_norm\": 0.7716763005780347,\n \"acc_norm_stderr\": 0.022598703804321635\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5743016759776536,\n\
\ \"acc_stderr\": 0.01653682964899712,\n \"acc_norm\": 0.5743016759776536,\n\
\ \"acc_norm_stderr\": 0.01653682964899712\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7556270096463023,\n\
\ \"acc_stderr\": 0.024406162094668907,\n \"acc_norm\": 0.7556270096463023,\n\
\ \"acc_norm_stderr\": 0.024406162094668907\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7993827160493827,\n \"acc_stderr\": 0.02228231394977488,\n\
\ \"acc_norm\": 0.7993827160493827,\n \"acc_norm_stderr\": 0.02228231394977488\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766006,\n \
\ \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766006\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5645371577574967,\n\
\ \"acc_stderr\": 0.012663412101248345,\n \"acc_norm\": 0.5645371577574967,\n\
\ \"acc_norm_stderr\": 0.012663412101248345\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.0265565194700415,\n\
\ \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.0265565194700415\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7418300653594772,\n \"acc_stderr\": 0.017704531653250078,\n \
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.017704531653250078\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.7545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866764,\n\
\ \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866764\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n\
\ \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n\
\ \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46511627906976744,\n\
\ \"mc1_stderr\": 0.01746084997587397,\n \"mc2\": 0.6479539766332348,\n\
\ \"mc2_stderr\": 0.014916593992436448\n }\n}\n```"
repo_url: https://huggingface.co/oh-yeontaek/llama-2-70B-LoRA-assemble
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|arc:challenge|25_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hellaswag|10_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T11-41-03.022396.parquet'
- config_name: results
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- results_2023-09-14T11-41-03.022396.parquet
- split: latest
path:
- results_2023-09-14T11-41-03.022396.parquet
---
# Dataset Card for Evaluation run of oh-yeontaek/llama-2-70B-LoRA-assemble
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/oh-yeontaek/llama-2-70B-LoRA-assemble
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [oh-yeontaek/llama-2-70B-LoRA-assemble](https://huggingface.co/oh-yeontaek/llama-2-70B-LoRA-assemble) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble",
"harness_truthfulqa_mc_0",
split="train")
```
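The aggregated numbers can be loaded the same way from the `results` configuration (a minimal sketch under the same assumptions; the row schema is whatever the results parquet listed above contains):
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split always resolves to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble",
	"results",
	split="latest")
print(results[0])  # inspect one row of the aggregated results
```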
## Latest results
These are the [latest results from run 2023-09-14T11:41:03.022396](https://huggingface.co/datasets/open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble/blob/main/results_2023-09-14T11-41-03.022396.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6934330265245879,
"acc_stderr": 0.031312838620430335,
"acc_norm": 0.697335554746802,
"acc_norm_stderr": 0.03128337547678218,
"mc1": 0.46511627906976744,
"mc1_stderr": 0.01746084997587397,
"mc2": 0.6479539766332348,
"mc2_stderr": 0.014916593992436448
},
"harness|arc:challenge|25": {
"acc": 0.6851535836177475,
"acc_stderr": 0.01357265770308495,
"acc_norm": 0.7184300341296929,
"acc_norm_stderr": 0.013143376735009022
},
"harness|hellaswag|10": {
"acc": 0.6707827126070504,
"acc_stderr": 0.00468968597815517,
"acc_norm": 0.867755427205736,
"acc_norm_stderr": 0.0033806414709899157
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7763157894736842,
"acc_stderr": 0.03391160934343603,
"acc_norm": 0.7763157894736842,
"acc_norm_stderr": 0.03391160934343603
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.027134291628741702,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.027134291628741702
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6638297872340425,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.6638297872340425,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.040287315329475576,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.040287315329475576
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4656084656084656,
"acc_stderr": 0.025690321762493844,
"acc_norm": 0.4656084656084656,
"acc_norm_stderr": 0.025690321762493844
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.021417242936321582,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.021417242936321582
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8939393939393939,
"acc_stderr": 0.021938047738853113,
"acc_norm": 0.8939393939393939,
"acc_norm_stderr": 0.021938047738853113
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6948717948717948,
"acc_stderr": 0.023346335293325887,
"acc_norm": 0.6948717948717948,
"acc_norm_stderr": 0.023346335293325887
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.02720537153827947,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.02720537153827947
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8862385321100917,
"acc_stderr": 0.013613614800232805,
"acc_norm": 0.8862385321100917,
"acc_norm_stderr": 0.013613614800232805
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8970588235294118,
"acc_stderr": 0.021328337570804365,
"acc_norm": 0.8970588235294118,
"acc_norm_stderr": 0.021328337570804365
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.021644195727955173,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.021644195727955173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.02799153425851952,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.02799153425851952
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744633,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744633
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.03004735765580663,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.03004735765580663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8633461047254151,
"acc_stderr": 0.012282876868629234,
"acc_norm": 0.8633461047254151,
"acc_norm_stderr": 0.012282876868629234
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7716763005780347,
"acc_stderr": 0.022598703804321635,
"acc_norm": 0.7716763005780347,
"acc_norm_stderr": 0.022598703804321635
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5743016759776536,
"acc_stderr": 0.01653682964899712,
"acc_norm": 0.5743016759776536,
"acc_norm_stderr": 0.01653682964899712
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7556270096463023,
"acc_stderr": 0.024406162094668907,
"acc_norm": 0.7556270096463023,
"acc_norm_stderr": 0.024406162094668907
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7993827160493827,
"acc_stderr": 0.02228231394977488,
"acc_norm": 0.7993827160493827,
"acc_norm_stderr": 0.02228231394977488
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5645371577574967,
"acc_stderr": 0.012663412101248345,
"acc_norm": 0.5645371577574967,
"acc_norm_stderr": 0.012663412101248345
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.0265565194700415,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.0265565194700415
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.017704531653250078,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.017704531653250078
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866764,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866764
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46511627906976744,
"mc1_stderr": 0.01746084997587397,
"mc2": 0.6479539766332348,
"mc2_stderr": 0.014916593992436448
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
TamerlanW/SD_Lene_lubimoi | 2023-09-26T19:20:57.000Z | [
"region:us"
] | TamerlanW | null | null | null | 0 | 0 | Entry not found |
CyberHarem/yorita_yoshino_idolmastercinderellagirls | 2023-09-17T17:37:19.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yorita_yoshino (THE iDOLM@STER: Cinderella Girls)
This is the dataset of yorita_yoshino (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 541 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 541 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 541 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 541 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
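If you want to fetch an archive programmatically rather than through the links above, a minimal sketch with `huggingface_hub` (an assumption; any client that can resolve this repo's file URLs works just as well):
```python
from huggingface_hub import hf_hub_download

# Download one of the packaged archives listed in the table above.
path = hf_hub_download(
    repo_id="CyberHarem/yorita_yoshino_idolmastercinderellagirls",
    filename="dataset-raw.zip",
    repo_type="dataset",
)
print(path)  # local cache path of the downloaded zip
```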
|
CyberHarem/sengoku_chihiro_sakurasounopetnakanojo | 2023-09-17T17:37:21.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Sengoku Chihiro
This is the dataset of Sengoku Chihiro, containing 71 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 71 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 170 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 71 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 71 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 71 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 71 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 71 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 170 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 170 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 170 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/mizumoto_yukari_idolmastercinderellagirls | 2023-09-17T17:37:23.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mizumoto_yukari (THE iDOLM@STER: Cinderella Girls)
This is the dataset of mizumoto_yukari (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 515 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 515 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 515 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 515 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
ai-nightcoder/Uzbek_LLM_llama | 2023-09-14T12:19:06.000Z | [
"license:llama2",
"region:us"
] | ai-nightcoder | null | null | null | 1 | 0 | ---
license: llama2
---
|
CyberHarem/rita_ainsworth_sakurasounopetnakanojo | 2023-09-17T17:37:25.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Rita Ainsworth
This is the dataset of Rita Ainsworth, containing 100 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 100 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 229 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 100 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 100 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 100 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 100 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 100 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 229 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 229 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 229 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
acumplido/HF_DATASET_NAME | 2023-09-14T13:34:20.000Z | [
"region:us"
] | acumplido | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Mikeer/llama2-papaer | 2023-09-14T12:32:27.000Z | [
"region:us"
] | Mikeer | null | null | null | 0 | 0 | Entry not found |
BangumiBase/edomaeelf | 2023-09-29T07:50:59.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Edomae Elf
This is the image base of the bangumi Edomae Elf; we detected 16 characters and 1946 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 658 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 20 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 40 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 80 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 10 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 6 | [Download](5/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 6 | 734 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 79 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 14 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 13 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 94 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 58 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 32 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 21 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 11 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 76 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
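To grab and unpack the full archive programmatically, a minimal sketch (assuming `huggingface_hub` is available; only the `all.zip` file named above is used):
```python
import zipfile

from huggingface_hub import hf_hub_download

# Download the full image archive and extract it into a local folder.
archive = hf_hub_download(
    repo_id="BangumiBase/edomaeelf",
    filename="all.zip",
    repo_type="dataset",
)
with zipfile.ZipFile(archive) as zf:
    zf.extractall("edomaeelf")
```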
|
edbeeching/godot_rl_AirHockey | 2023-09-14T12:42:38.000Z | [
"deep-reinforcement-learning",
"reinforcement-learning",
"godot-rl",
"environments",
"video-games",
"region:us"
] | edbeeching | null | null | null | 0 | 0 | ---
library_name: godot-rl
tags:
- deep-reinforcement-learning
- reinforcement-learning
- godot-rl
- environments
- video-games
---
An RL environment called AirHockey for the Godot Game Engine.
This environment was created with: https://github.com/edbeeching/godot_rl_agents
## Downloading the environment
After installing Godot RL Agents, download the environment with:
```
gdrl.env_from_hub -r edbeeching/godot_rl_AirHockey
```
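If you only need the raw files, the repo can also be fetched directly (a sketch assuming `huggingface_hub` is installed; the `gdrl.env_from_hub` command above remains the documented path):
```python
from huggingface_hub import snapshot_download

# Mirror every file of the environment repo into the local HF cache.
local_dir = snapshot_download(repo_id="edbeeching/godot_rl_AirHockey",
                              repo_type="dataset")
print(local_dir)
```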
|
hmxiong/ScanQA_Finetune | 2023-09-14T12:56:37.000Z | [
"region:us"
] | hmxiong | null | null | null | 0 | 0 | Entry not found |
bongo2112/moodewji-v3-SDxl-output-images | 2023-09-14T20:10:50.000Z | [
"region:us"
] | bongo2112 | null | null | null | 0 | 0 | Entry not found |
anjingdankucing/catdog | 2023-09-14T13:30:46.000Z | [
"region:us"
] | anjingdankucing | null | null | null | 0 | 0 | Entry not found |
shaowenchen/wiki_zh | 2023-09-15T05:43:18.000Z | [
"region:us"
] | shaowenchen | null | null | null | 0 | 0 | As of 2019.2.7 |
open-llm-leaderboard/details_wei123602__llama2-13b-fintune2-4E | 2023-09-14T13:47:10.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of wei123602/llama2-13b-fintune2-4E
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wei123602/llama2-13b-fintune2-4E](https://huggingface.co/wei123602/llama2-13b-fintune2-4E)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wei123602__llama2-13b-fintune2-4E\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-14T13:45:51.161008](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__llama2-13b-fintune2-4E/blob/main/results_2023-09-14T13-45-51.161008.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5383184272381838,\n\
\ \"acc_stderr\": 0.034850754300598634,\n \"acc_norm\": 0.5422336919250181,\n\
\ \"acc_norm_stderr\": 0.034833469783393314,\n \"mc1\": 0.29253365973072215,\n\
\ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.4272280203904765,\n\
\ \"mc2_stderr\": 0.015202924644061788\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5187713310580204,\n \"acc_stderr\": 0.014601090150633964,\n\
\ \"acc_norm\": 0.5588737201365188,\n \"acc_norm_stderr\": 0.014509747749064663\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6186018721370244,\n\
\ \"acc_stderr\": 0.004847372670134645,\n \"acc_norm\": 0.8095000995817566,\n\
\ \"acc_norm_stderr\": 0.003918928556590478\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480863,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480863\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779205,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779205\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.03040233144576954,\n\
\ \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.03040233144576954\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.49710982658959535,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.49710982658959535,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.02494236893115979,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.02494236893115979\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.567741935483871,\n\
\ \"acc_stderr\": 0.028181739720019416,\n \"acc_norm\": 0.567741935483871,\n\
\ \"acc_norm_stderr\": 0.028181739720019416\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.39408866995073893,\n \"acc_stderr\": 0.034381579670365446,\n\
\ \"acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.034381579670365446\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.037131580674819135,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.037131580674819135\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6818181818181818,\n \"acc_stderr\": 0.03318477333845331,\n \"\
acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.03318477333845331\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.03051611137147601,\n\
\ \"acc_norm\": 0.7668393782383419,\n \"acc_norm_stderr\": 0.03051611137147601\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.541025641025641,\n \"acc_stderr\": 0.025265525491284295,\n \
\ \"acc_norm\": 0.541025641025641,\n \"acc_norm_stderr\": 0.025265525491284295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948492,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.03243718055137411,\n \
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.03243718055137411\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.726605504587156,\n \"acc_stderr\": 0.0191092998460983,\n \"acc_norm\"\
: 0.726605504587156,\n \"acc_norm_stderr\": 0.0191092998460983\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.03362277436608043,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.03362277436608043\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.03132179803083291,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.03132179803083291\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.04369236326573981,\n \"\
acc_norm\": 0.6446280991735537,\n \"acc_norm_stderr\": 0.04369236326573981\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6319018404907976,\n \"acc_stderr\": 0.03789213935838396,\n\
\ \"acc_norm\": 0.6319018404907976,\n \"acc_norm_stderr\": 0.03789213935838396\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.02685345037700916,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.02685345037700916\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7139208173690932,\n\
\ \"acc_stderr\": 0.01616087140512754,\n \"acc_norm\": 0.7139208173690932,\n\
\ \"acc_norm_stderr\": 0.01616087140512754\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6098265895953757,\n \"acc_stderr\": 0.026261677607806636,\n\
\ \"acc_norm\": 0.6098265895953757,\n \"acc_norm_stderr\": 0.026261677607806636\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29497206703910617,\n\
\ \"acc_stderr\": 0.015251931579208173,\n \"acc_norm\": 0.29497206703910617,\n\
\ \"acc_norm_stderr\": 0.015251931579208173\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5718954248366013,\n \"acc_stderr\": 0.02833239748366428,\n\
\ \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.02833239748366428\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n\
\ \"acc_stderr\": 0.027210420375934023,\n \"acc_norm\": 0.6430868167202572,\n\
\ \"acc_norm_stderr\": 0.027210420375934023\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.027431623722415012,\n\
\ \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.027431623722415012\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291484,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291484\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4315514993481095,\n\
\ \"acc_stderr\": 0.012650007999463881,\n \"acc_norm\": 0.4315514993481095,\n\
\ \"acc_norm_stderr\": 0.012650007999463881\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5473856209150327,\n \"acc_stderr\": 0.020136790918492534,\n \
\ \"acc_norm\": 0.5473856209150327,\n \"acc_norm_stderr\": 0.020136790918492534\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5469387755102041,\n \"acc_stderr\": 0.03186785930004129,\n\
\ \"acc_norm\": 0.5469387755102041,\n \"acc_norm_stderr\": 0.03186785930004129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.03235743789355043,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.03235743789355043\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29253365973072215,\n\
\ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.4272280203904765,\n\
\ \"mc2_stderr\": 0.015202924644061788\n }\n}\n```"
repo_url: https://huggingface.co/wei123602/llama2-13b-fintune2-4E
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|arc:challenge|25_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hellaswag|10_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T13-45-51.161008.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T13-45-51.161008.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T13-45-51.161008.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T13-45-51.161008.parquet'
- config_name: results
data_files:
- split: 2023_09_14T13_45_51.161008
path:
- results_2023-09-14T13-45-51.161008.parquet
- split: latest
path:
- results_2023-09-14T13-45-51.161008.parquet
---
# Dataset Card for Evaluation run of wei123602/llama2-13b-fintune2-4E
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wei123602/llama2-13b-fintune2-4E
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [wei123602/llama2-13b-fintune2-4E](https://huggingface.co/wei123602/llama2-13b-fintune2-4E) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wei123602__llama2-13b-fintune2-4E",
"harness_truthfulqa_mc_0",
split="train")
```
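Each per-task config can be loaded the same way. As a minimal sketch (assuming only the `datasets` library and the config and split names listed in this card), this loads one MMLU sub-task, both its "latest" split and the timestamp-named split for a specific run:
```python
from datasets import load_dataset

# "latest" always points at the most recent evaluation run for this config.
latest_details = load_dataset(
    "open-llm-leaderboard/details_wei123602__llama2-13b-fintune2-4E",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)

# Splits are also named after the run timestamp, so a run can be pinned explicitly.
pinned_details = load_dataset(
    "open-llm-leaderboard/details_wei123602__llama2-13b-fintune2-4E",
    "harness_hendrycksTest_abstract_algebra_5",
    split="2023_09_14T13_45_51.161008",
)
```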
## Latest results
These are the [latest results from run 2023-09-14T13:45:51.161008](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__llama2-13b-fintune2-4E/blob/main/results_2023-09-14T13-45-51.161008.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5383184272381838,
"acc_stderr": 0.034850754300598634,
"acc_norm": 0.5422336919250181,
"acc_norm_stderr": 0.034833469783393314,
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.4272280203904765,
"mc2_stderr": 0.015202924644061788
},
"harness|arc:challenge|25": {
"acc": 0.5187713310580204,
"acc_stderr": 0.014601090150633964,
"acc_norm": 0.5588737201365188,
"acc_norm_stderr": 0.014509747749064663
},
"harness|hellaswag|10": {
"acc": 0.6186018721370244,
"acc_stderr": 0.004847372670134645,
"acc_norm": 0.8095000995817566,
"acc_norm_stderr": 0.003918928556590478
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480863,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480863
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779205,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779205
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5773584905660377,
"acc_stderr": 0.03040233144576954,
"acc_norm": 0.5773584905660377,
"acc_norm_stderr": 0.03040233144576954
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.02494236893115979,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.02494236893115979
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.567741935483871,
"acc_stderr": 0.028181739720019416,
"acc_norm": 0.567741935483871,
"acc_norm_stderr": 0.028181739720019416
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.034381579670365446,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.034381579670365446
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.037131580674819135,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.037131580674819135
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.03318477333845331,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.03318477333845331
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7668393782383419,
"acc_stderr": 0.03051611137147601,
"acc_norm": 0.7668393782383419,
"acc_norm_stderr": 0.03051611137147601
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.541025641025641,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.541025641025641,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948492,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.03243718055137411,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.03243718055137411
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.726605504587156,
"acc_stderr": 0.0191092998460983,
"acc_norm": 0.726605504587156,
"acc_norm_stderr": 0.0191092998460983
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.03132179803083291,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.03132179803083291
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6319018404907976,
"acc_stderr": 0.03789213935838396,
"acc_norm": 0.6319018404907976,
"acc_norm_stderr": 0.03789213935838396
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.02685345037700916,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.02685345037700916
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7139208173690932,
"acc_stderr": 0.01616087140512754,
"acc_norm": 0.7139208173690932,
"acc_norm_stderr": 0.01616087140512754
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6098265895953757,
"acc_stderr": 0.026261677607806636,
"acc_norm": 0.6098265895953757,
"acc_norm_stderr": 0.026261677607806636
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29497206703910617,
"acc_stderr": 0.015251931579208173,
"acc_norm": 0.29497206703910617,
"acc_norm_stderr": 0.015251931579208173
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5718954248366013,
"acc_stderr": 0.02833239748366428,
"acc_norm": 0.5718954248366013,
"acc_norm_stderr": 0.02833239748366428
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.027210420375934023,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.027210420375934023
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.027431623722415012,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.027431623722415012
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291484,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291484
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4315514993481095,
"acc_stderr": 0.012650007999463881,
"acc_norm": 0.4315514993481095,
"acc_norm_stderr": 0.012650007999463881
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5473856209150327,
"acc_stderr": 0.020136790918492534,
"acc_norm": 0.5473856209150327,
"acc_norm_stderr": 0.020136790918492534
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5469387755102041,
"acc_stderr": 0.03186785930004129,
"acc_norm": 0.5469387755102041,
"acc_norm_stderr": 0.03186785930004129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355043,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355043
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.4272280203904765,
"mc2_stderr": 0.015202924644061788
}
}
```
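These aggregated numbers come from the "results" config; as a minimal sketch (assuming the `datasets` library and the config and split names defined above), they can be reloaded directly:
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics shown above;
# its "latest" split points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_wei123602__llama2-13b-fintune2-4E",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated metrics record
```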
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_wei123602__llama2-13b-FINETUNE3_TEST | 2023-09-14T13:49:44.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of wei123602/llama2-13b-FINETUNE3_TEST
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wei123602/llama2-13b-FINETUNE3_TEST](https://huggingface.co/wei123602/llama2-13b-FINETUNE3_TEST)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wei123602__llama2-13b-FINETUNE3_TEST\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-14T13:48:26.265439](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__llama2-13b-FINETUNE3_TEST/blob/main/results_2023-09-14T13-48-26.265439.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5444895368748129,\n\
\ \"acc_stderr\": 0.03478801872078259,\n \"acc_norm\": 0.5489775910790865,\n\
\ \"acc_norm_stderr\": 0.03477242589701758,\n \"mc1\": 0.27539779681762544,\n\
\ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.40223962650004846,\n\
\ \"mc2_stderr\": 0.014370809574399035\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47952218430034127,\n \"acc_stderr\": 0.014599131353035004,\n\
\ \"acc_norm\": 0.5366894197952219,\n \"acc_norm_stderr\": 0.014572000527756989\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5889265086636128,\n\
\ \"acc_stderr\": 0.004910229643262737,\n \"acc_norm\": 0.7965544712208723,\n\
\ \"acc_norm_stderr\": 0.004017383866405767\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.0404633688397825,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.0404633688397825\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.030197611600197946,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.030197611600197946\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3931034482758621,\n \"acc_stderr\": 0.040703290137070705,\n\
\ \"acc_norm\": 0.3931034482758621,\n \"acc_norm_stderr\": 0.040703290137070705\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3201058201058201,\n \"acc_stderr\": 0.024026846392873506,\n \"\
acc_norm\": 0.3201058201058201,\n \"acc_norm_stderr\": 0.024026846392873506\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n\
\ \"acc_stderr\": 0.027575960723278246,\n \"acc_norm\": 0.6225806451612903,\n\
\ \"acc_norm_stderr\": 0.027575960723278246\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0368105086916155,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0368105086916155\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.03332299921070644,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.03332299921070644\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548047,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548047\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5692307692307692,\n \"acc_stderr\": 0.02510682066053975,\n \
\ \"acc_norm\": 0.5692307692307692,\n \"acc_norm_stderr\": 0.02510682066053975\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.02763490726417854,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.02763490726417854\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413926,\n \
\ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413926\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7889908256880734,\n \"acc_stderr\": 0.01749392240411265,\n \"\
acc_norm\": 0.7889908256880734,\n \"acc_norm_stderr\": 0.01749392240411265\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n\
\ \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n\
\ \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262971,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262971\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.04453197507374983,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.04453197507374983\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6257668711656442,\n \"acc_stderr\": 0.03802068102899616,\n\
\ \"acc_norm\": 0.6257668711656442,\n \"acc_norm_stderr\": 0.03802068102899616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291518,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291518\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.0458212416016155,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.0458212416016155\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n\
\ \"acc_stderr\": 0.024904439098918228,\n \"acc_norm\": 0.8247863247863247,\n\
\ \"acc_norm_stderr\": 0.024904439098918228\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.719029374201788,\n\
\ \"acc_stderr\": 0.016073127851221225,\n \"acc_norm\": 0.719029374201788,\n\
\ \"acc_norm_stderr\": 0.016073127851221225\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5722543352601156,\n \"acc_stderr\": 0.026636539741116086,\n\
\ \"acc_norm\": 0.5722543352601156,\n \"acc_norm_stderr\": 0.026636539741116086\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3128491620111732,\n\
\ \"acc_stderr\": 0.015506892594647274,\n \"acc_norm\": 0.3128491620111732,\n\
\ \"acc_norm_stderr\": 0.015506892594647274\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.028245134024387296,\n\
\ \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.028245134024387296\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6270096463022508,\n\
\ \"acc_stderr\": 0.0274666102131401,\n \"acc_norm\": 0.6270096463022508,\n\
\ \"acc_norm_stderr\": 0.0274666102131401\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027125115513166858,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027125115513166858\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.02883892147125146,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.02883892147125146\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n\
\ \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n\
\ \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5845588235294118,\n \"acc_stderr\": 0.02993534270787774,\n\
\ \"acc_norm\": 0.5845588235294118,\n \"acc_norm_stderr\": 0.02993534270787774\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5522875816993464,\n \"acc_stderr\": 0.020116925347422425,\n \
\ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.020116925347422425\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5428571428571428,\n \"acc_stderr\": 0.031891418324213966,\n\
\ \"acc_norm\": 0.5428571428571428,\n \"acc_norm_stderr\": 0.031891418324213966\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n\
\ \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.6716417910447762,\n\
\ \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03377310252209205,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03377310252209205\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27539779681762544,\n\
\ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.40223962650004846,\n\
\ \"mc2_stderr\": 0.014370809574399035\n }\n}\n```"
repo_url: https://huggingface.co/wei123602/llama2-13b-FINETUNE3_TEST
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|arc:challenge|25_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hellaswag|10_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T13-48-26.265439.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T13-48-26.265439.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T13-48-26.265439.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T13-48-26.265439.parquet'
- config_name: results
data_files:
- split: 2023_09_14T13_48_26.265439
path:
- results_2023-09-14T13-48-26.265439.parquet
- split: latest
path:
- results_2023-09-14T13-48-26.265439.parquet
---
# Dataset Card for Evaluation run of wei123602/llama2-13b-FINETUNE3_TEST
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wei123602/llama2-13b-FINETUNE3_TEST
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [wei123602/llama2-13b-FINETUNE3_TEST](https://huggingface.co/wei123602/llama2-13b-FINETUNE3_TEST) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wei123602__llama2-13b-FINETUNE3_TEST",
"harness_truthfulqa_mc_0",
split="train")
```
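Each per-task configuration listed in this card's `configs` section can be loaded the same way; both the config name and the "latest" split below come straight from that section. A minimal sketch, assuming the `datasets` library is installed:
```python
from datasets import load_dataset

# Load the most recent details for a single MMLU subtask of this run;
# any config_name from the configs section above can be substituted here.
data = load_dataset(
    "open-llm-leaderboard/details_wei123602__llama2-13b-FINETUNE3_TEST",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
print(data)
```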
## Latest results
These are the [latest results from run 2023-09-14T13:48:26.265439](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__llama2-13b-FINETUNE3_TEST/blob/main/results_2023-09-14T13-48-26.265439.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5444895368748129,
"acc_stderr": 0.03478801872078259,
"acc_norm": 0.5489775910790865,
"acc_norm_stderr": 0.03477242589701758,
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.40223962650004846,
"mc2_stderr": 0.014370809574399035
},
"harness|arc:challenge|25": {
"acc": 0.47952218430034127,
"acc_stderr": 0.014599131353035004,
"acc_norm": 0.5366894197952219,
"acc_norm_stderr": 0.014572000527756989
},
"harness|hellaswag|10": {
"acc": 0.5889265086636128,
"acc_stderr": 0.004910229643262737,
"acc_norm": 0.7965544712208723,
"acc_norm_stderr": 0.004017383866405767
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.0404633688397825,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.0404633688397825
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.030197611600197946,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.030197611600197946
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.043391383225798615,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.043391383225798615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3931034482758621,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.3931034482758621,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.024026846392873506,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.024026846392873506
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.027575960723278246,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.027575960723278246
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0368105086916155,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0368105086916155
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.03332299921070644,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.03332299921070644
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548047,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548047
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5692307692307692,
"acc_stderr": 0.02510682066053975,
"acc_norm": 0.5692307692307692,
"acc_norm_stderr": 0.02510682066053975
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.02763490726417854,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.02763490726417854
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.03181110032413926,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.03181110032413926
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7889908256880734,
"acc_stderr": 0.01749392240411265,
"acc_norm": 0.7889908256880734,
"acc_norm_stderr": 0.01749392240411265
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262971,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262971
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.04453197507374983,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.04453197507374983
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6257668711656442,
"acc_stderr": 0.03802068102899616,
"acc_norm": 0.6257668711656442,
"acc_norm_stderr": 0.03802068102899616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291518,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291518
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.0458212416016155,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.0458212416016155
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8247863247863247,
"acc_stderr": 0.024904439098918228,
"acc_norm": 0.8247863247863247,
"acc_norm_stderr": 0.024904439098918228
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.719029374201788,
"acc_stderr": 0.016073127851221225,
"acc_norm": 0.719029374201788,
"acc_norm_stderr": 0.016073127851221225
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.026636539741116086,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.026636539741116086
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3128491620111732,
"acc_stderr": 0.015506892594647274,
"acc_norm": 0.3128491620111732,
"acc_norm_stderr": 0.015506892594647274
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.028245134024387296,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.028245134024387296
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6270096463022508,
"acc_stderr": 0.0274666102131401,
"acc_norm": 0.6270096463022508,
"acc_norm_stderr": 0.0274666102131401
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027125115513166858,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027125115513166858
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.02883892147125146,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.02883892147125146
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5845588235294118,
"acc_stderr": 0.02993534270787774,
"acc_norm": 0.5845588235294118,
"acc_norm_stderr": 0.02993534270787774
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.020116925347422425,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.020116925347422425
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5428571428571428,
"acc_stderr": 0.031891418324213966,
"acc_norm": 0.5428571428571428,
"acc_norm_stderr": 0.031891418324213966
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6716417910447762,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.6716417910447762,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03377310252209205,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03377310252209205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.40223962650004846,
"mc2_stderr": 0.014370809574399035
}
}
```
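The aggregated numbers above are also stored as rows of the "results" configuration (see the `configs` section of this card), so they can be read back programmatically rather than copied from the JSON. A minimal sketch, under the same assumptions as the loading example above:
```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics of the run;
# its "latest" split points at the most recent results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_wei123602__llama2-13b-FINETUNE3_TEST",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated metrics
```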
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_wei123602__llama2-13b-FINETUNE3_TEST2 | 2023-09-14T13:52:52.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of wei123602/llama2-13b-FINETUNE3_TEST2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wei123602/llama2-13b-FINETUNE3_TEST2](https://huggingface.co/wei123602/llama2-13b-FINETUNE3_TEST2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wei123602__llama2-13b-FINETUNE3_TEST2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-14T13:51:34.438102](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__llama2-13b-FINETUNE3_TEST2/blob/main/results_2023-09-14T13-51-34.438102.json)\
\ (note that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5677513161504792,\n\
\ \"acc_stderr\": 0.034667527806544376,\n \"acc_norm\": 0.5718243148374196,\n\
\ \"acc_norm_stderr\": 0.03464965050195055,\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608767,\n \"mc2\": 0.399328906923071,\n\
\ \"mc2_stderr\": 0.01423505140557656\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5136518771331058,\n \"acc_stderr\": 0.014605943429860945,\n\
\ \"acc_norm\": 0.5469283276450512,\n \"acc_norm_stderr\": 0.014546892052005628\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6077474606652061,\n\
\ \"acc_stderr\": 0.004872546302641852,\n \"acc_norm\": 0.8147779326827326,\n\
\ \"acc_norm_stderr\": 0.0038768367094611364\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04046336883978251,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04046336883978251\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137285,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137285\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5516129032258065,\n\
\ \"acc_stderr\": 0.028292056830112735,\n \"acc_norm\": 0.5516129032258065,\n\
\ \"acc_norm_stderr\": 0.028292056830112735\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438803,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438803\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624335,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624335\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5538461538461539,\n \"acc_stderr\": 0.02520357177302833,\n \
\ \"acc_norm\": 0.5538461538461539,\n \"acc_norm_stderr\": 0.02520357177302833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.031693802357129965,\n\
\ \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.031693802357129965\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8,\n \"acc_stderr\": 0.017149858514250958,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.017149858514250958\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.03395322726375798,\n\
\ \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.03395322726375798\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.04373313040914761,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.04373313040914761\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.02645350805404033,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.02645350805404033\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7637292464878672,\n\
\ \"acc_stderr\": 0.015190473717037495,\n \"acc_norm\": 0.7637292464878672,\n\
\ \"acc_norm_stderr\": 0.015190473717037495\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5982658959537572,\n \"acc_stderr\": 0.026394104177643634,\n\
\ \"acc_norm\": 0.5982658959537572,\n \"acc_norm_stderr\": 0.026394104177643634\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39553072625698327,\n\
\ \"acc_stderr\": 0.016353415410075775,\n \"acc_norm\": 0.39553072625698327,\n\
\ \"acc_norm_stderr\": 0.016353415410075775\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.02827549015679146,\n\
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.02827549015679146\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.027098652621301757,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.027098652621301757\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.02662415247884585,\n\
\ \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.02662415247884585\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778855,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778855\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43089960886571055,\n\
\ \"acc_stderr\": 0.012647695889547231,\n \"acc_norm\": 0.43089960886571055,\n\
\ \"acc_norm_stderr\": 0.012647695889547231\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5833333333333334,\n \"acc_stderr\": 0.01994491413687358,\n \
\ \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.01994491413687358\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5918367346938775,\n \"acc_stderr\": 0.03146465712827424,\n\
\ \"acc_norm\": 0.5918367346938775,\n \"acc_norm_stderr\": 0.03146465712827424\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608767,\n \"mc2\": 0.399328906923071,\n\
\ \"mc2_stderr\": 0.01423505140557656\n }\n}\n```"
repo_url: https://huggingface.co/wei123602/llama2-13b-FINETUNE3_TEST2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|arc:challenge|25_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hellaswag|10_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T13-51-34.438102.parquet'
- config_name: results
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- results_2023-09-14T13-51-34.438102.parquet
- split: latest
path:
- results_2023-09-14T13-51-34.438102.parquet
---
# Dataset Card for Evaluation run of wei123602/llama2-13b-FINETUNE3_TEST2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wei123602/llama2-13b-FINETUNE3_TEST2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [wei123602/llama2-13b-FINETUNE3_TEST2](https://huggingface.co/wei123602/llama2-13b-FINETUNE3_TEST2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wei123602__llama2-13b-FINETUNE3_TEST2",
"harness_truthfulqa_mc_0",
split="train")
```
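The same pattern works for the aggregated scores via the "results" configuration listed above (a minimal sketch; the configuration and split names are taken from this card, but the exact row layout of the results table is not documented here):
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated scores; the "latest"
# split always points to the most recent evaluation run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_wei123602__llama2-13b-FINETUNE3_TEST2",
    "results",
    split="latest",
)
print(results[0])  # a single row carrying the aggregated metrics
```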
## Latest results
These are the [latest results from run 2023-09-14T13:51:34.438102](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__llama2-13b-FINETUNE3_TEST2/blob/main/results_2023-09-14T13-51-34.438102.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5677513161504792,
"acc_stderr": 0.034667527806544376,
"acc_norm": 0.5718243148374196,
"acc_norm_stderr": 0.03464965050195055,
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608767,
"mc2": 0.399328906923071,
"mc2_stderr": 0.01423505140557656
},
"harness|arc:challenge|25": {
"acc": 0.5136518771331058,
"acc_stderr": 0.014605943429860945,
"acc_norm": 0.5469283276450512,
"acc_norm_stderr": 0.014546892052005628
},
"harness|hellaswag|10": {
"acc": 0.6077474606652061,
"acc_stderr": 0.004872546302641852,
"acc_norm": 0.8147779326827326,
"acc_norm_stderr": 0.0038768367094611364
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137285,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137285
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5516129032258065,
"acc_stderr": 0.028292056830112735,
"acc_norm": 0.5516129032258065,
"acc_norm_stderr": 0.028292056830112735
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03481904844438803,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03481904844438803
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624335,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624335
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5538461538461539,
"acc_stderr": 0.02520357177302833,
"acc_norm": 0.5538461538461539,
"acc_norm_stderr": 0.02520357177302833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253252,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253252
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6092436974789915,
"acc_stderr": 0.031693802357129965,
"acc_norm": 0.6092436974789915,
"acc_norm_stderr": 0.031693802357129965
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.017149858514250958,
"acc_norm": 0.8,
"acc_norm_stderr": 0.017149858514250958
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.03395322726375798,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.03395322726375798
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928276,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.04373313040914761,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.04373313040914761
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.02645350805404033,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.02645350805404033
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7637292464878672,
"acc_stderr": 0.015190473717037495,
"acc_norm": 0.7637292464878672,
"acc_norm_stderr": 0.015190473717037495
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5982658959537572,
"acc_stderr": 0.026394104177643634,
"acc_norm": 0.5982658959537572,
"acc_norm_stderr": 0.026394104177643634
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39553072625698327,
"acc_stderr": 0.016353415410075775,
"acc_norm": 0.39553072625698327,
"acc_norm_stderr": 0.016353415410075775
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.02827549015679146,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.02827549015679146
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.027098652621301757,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.027098652621301757
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6450617283950617,
"acc_stderr": 0.02662415247884585,
"acc_norm": 0.6450617283950617,
"acc_norm_stderr": 0.02662415247884585
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.029555454236778855,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.029555454236778855
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43089960886571055,
"acc_stderr": 0.012647695889547231,
"acc_norm": 0.43089960886571055,
"acc_norm_stderr": 0.012647695889547231
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5625,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.01994491413687358,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.01994491413687358
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5918367346938775,
"acc_stderr": 0.03146465712827424,
"acc_norm": 0.5918367346938775,
"acc_norm_stderr": 0.03146465712827424
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608767,
"mc2": 0.399328906923071,
"mc2_stderr": 0.01423505140557656
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/takafuji_kako_idolmastercinderellagirls | 2023-09-17T17:37:27.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of takafuji_kako (THE iDOLM@STER: Cinderella Girls)
This is the dataset of takafuji_kako (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 511 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 511 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 511 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 511 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
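To fetch one of these packages programmatically, something like the following should work (a minimal sketch, assuming the zip files live at the root of this dataset repository, as the relative links above suggest):
```python
from huggingface_hub import hf_hub_download

# Download the raw package (200 images plus meta information) into the local cache.
path = hf_hub_download(
    repo_id="CyberHarem/takafuji_kako_idolmastercinderellagirls",
    filename="dataset-raw.zip",
    repo_type="dataset",
)
print(path)  # local path of the downloaded archive
```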
|
CyberHarem/mukai_takumi_idolmastercinderellagirls | 2023-09-17T17:37:29.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mukai_takumi (THE iDOLM@STER: Cinderella Girls)
This is the dataset of mukai_takumi (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 539 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 539 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 539 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 539 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
loubnabnl/notebook | 2023-09-14T14:25:52.000Z | [
"region:us"
] | loubnabnl | null | null | null | 0 | 0 | Entry not found |
wennie/huan | 2023-09-14T14:17:40.000Z | [
"region:us"
] | wennie | null | null | null | 0 | 0 | Entry not found |
metabloit/offensive-swahili-text | 2023-09-14T14:33:52.000Z | [
"task_categories:text-classification",
"size_categories:1K<n<10K",
"language:sw",
"license:mit",
"region:us"
] | metabloit | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-classification
language:
- sw
size_categories:
- 1K<n<10K
viewer: true
---
# Overview
This dataset contains offensive and non-offensive sentences. The data was scraped from JamiiForums using a prepared wordlist.
The dataset consists of sentences that contain Swahili abusive words (from the wordlist); it does not include sarcastic abuse.
## Dataset details
The dataset is divided into train, evaluation, and test splits. The training split consists of 4954 sentences, the evaluation split
of 990 sentences, and the test split of 660 sentences.
### Dataset annotations
- 0: non-offensive
- 1: offensive
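A minimal loading sketch with the `datasets` library is shown below; the split name `train` and the exact column names are assumptions, not confirmed by this card, so inspect the loaded object first.
```python
from datasets import load_dataset

# Load the dataset from the Hub.
ds = load_dataset("metabloit/offensive-swahili-text")
print(ds)  # inspect the actual split and column names

# Label mapping per the annotations above.
label_names = {0: "non-offensive", 1: "offensive"}

example = ds["train"][0]  # assumes a split named "train" exists
print(example)
```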
|
Chung2023/Stable-diffusion-photo | 2023-09-14T14:25:05.000Z | [
"region:us"
] | Chung2023 | null | null | null | 0 | 0 | Entry not found |
KnutJaegersberg/orca-wizardlm-v1-clustered | 2023-09-14T14:39:25.000Z | [
"license:cc-by-nc-4.0",
"region:us"
] | KnutJaegersberg | null | null | null | 0 | 0 | ---
license: cc-by-nc-4.0
---
This dataset uses the system instructions from the orca-wizardlm dataset, but attaches the original evolved instruction responses and two cluster solutions.
The dolphin clusters are clusters of the instructions only, computed with the SGPT approach using the dolphin-7b model.
The GTE clusters are clusters of the instructions and responses, serving as a rough topic mapping.
I use these cluster solutions to downsample the dataset by topic, whilst preserving clusters of 'evolved instructions'. |
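As an illustration, the sketch below shows one way such cluster-preserving downsampling can work; the column names and cluster counts are invented for the example.
```python
import random

import pandas as pd

random.seed(0)

# Toy stand-in for the dataset; the real column names are assumptions.
df = pd.DataFrame({
    "instruction": [f"instruction {i}" for i in range(1000)],
    "gte_cluster": [random.randrange(50) for _ in range(1000)],
})

# Downsample by topic: drop a random subset of whole clusters rather than
# sampling individual rows, so every surviving cluster stays intact.
keep = set(random.sample(sorted(df["gte_cluster"].unique()), k=25))
downsampled = df[df["gte_cluster"].isin(keep)]
print(f"{len(downsampled)} rows kept from {len(keep)} clusters")
```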
KnutJaegersberg/dolphin_orca_clustered | 2023-09-14T15:24:42.000Z | [
"license:cc-by-nc-4.0",
"region:us"
] | KnutJaegersberg | null | null | null | 0 | 0 | ---
license: cc-by-nc-4.0
---
This dataset is the gpt-4 generated subset of the dolphin / orca dataset, with clusters from:
- gte embeddings
- dolphin-7b sgpt embeddings
- bigbird document embeddings (SimCSE sentence-transformer based, for the whole document) |
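For illustration, a hedged sketch of embedding-based clustering follows; the exact checkpoints and clustering algorithm behind this dataset are not stated here, so `thenlper/gte-base` and k-means are assumptions.
```python
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

texts = [
    "Explain how photosynthesis works.",
    "Write a Python function that sorts a list.",
    "Summarize the causes of World War I.",
]

# GTE-style sentence embeddings; the checkpoint is an assumption.
model = SentenceTransformer("thenlper/gte-base")
embeddings = model.encode(texts, normalize_embeddings=True)

# K-means is one possible clustering choice, not necessarily the one used here.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(embeddings)
print(list(zip(texts, labels)))
```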
hezhaoqia/sd | 2023-09-14T15:04:08.000Z | [
"region:us"
] | hezhaoqia | null | null | null | 0 | 0 | Entry not found |
MichaelPape/timer-dataset | 2023-09-15T20:24:54.000Z | [
"license:apache-2.0",
"region:us"
] | MichaelPape | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
Lechcher/images_for_my_test_website | 2023-09-14T15:29:24.000Z | [
"license:openrail",
"region:us"
] | Lechcher | null | null | null | 0 | 0 | ---
license: openrail
---
|
yachay/text_coordinates_seasons | 2023-09-22T13:28:15.000Z | [
"task_categories:feature-extraction",
"task_categories:token-classification",
"task_categories:text-classification",
"size_categories:100M<n<1B",
"language:en",
"language:es",
"language:ru",
"language:co",
"language:ar",
"language:fa",
"license:mit",
"multilingual",
"text",
"coordinates",
... | yachay | null | null | null | 1 | 0 | ---
license: mit
tags:
- multilingual
- text
- coordinates
- geospatial
- translation
- NER
- geo
- geo-tagged
- named-entity-recognition
- natural-language-processing
- geographic-data
- geolocation
- twitter
- reddit
task_categories:
- feature-extraction
- token-classification
- text-classification
pretty_name: Geo-Tagged Social Media Posts with Timestamps
language:
- en
- es
- ru
- co
- ar
- fa
size_categories:
- 100M<n<1B
---
# Dataset Card for Geo-Tagged Social Media Posts with Timestamps
## Dataset Description
- **Homepage:** https://huggingface.co/datasets/yachay/text_coordinates_seasons
- **Repository:** https://github.com/Yachay-AI/byt5-geotagging#datasets
- **Paper:** https://dev.to/yachayai/applying-machine-learning-to-geolocate-twitter-posts-2m1d
- **Leaderboard:**
- **Point of Contact:** admin-team@yachay.ai
### Dataset Summary
The "Seasons" dataset is a collection of over 600,000 social media posts spanning 12 months and encompassing 15 distinct time zones. It focuses on six countries: **Cuba, Iran, Russia, North Korea, Syria, and Venezuela,** with each post containing textual content, timestamps, and geographical coordinates. The dataset's primary objective is to investigate the correlation between the timing of posts, their content, and the geographical locations. Researchers can leverage this dataset to advance studies in geospatial NLP and gain insights into how temporal factors and seasonality impact the results.
### Supported Tasks and Leaderboards
This dataset is well-suited for tasks such as geotagging, where the objective is to associate text with specific geographical locations. It can also be utilized for geolocation analysis, sentiment analysis in regional contexts, and regional text classification.
### Languages
**Multilingual Dataset**
Mainly contains English, Spanish, Persian, Russian, Korean, and Arabic.
## Dataset Structure
### Data Instances
The "Seasons" dataset consists of over 600,000 data instances, each representing a social media post.
### Data Fields
**Text (text):** This field contains the textual content.
**Timestamp (created_at):** The dataset includes timestamps to track the exact time when each social media post was created. Timestamps are recorded in Unix epoch time format (milliseconds, as in the example below).
**Geographical Coordinates (geo_geo_bbox):** This field contains geocoordinates that describe the geographical location associated with each social media post. These coordinates are represented as latitude and longitude ranges in a bounding box format.
```json
{
  "text": "sample text",
  "geo_geo_bbox": "[-67.220209, 9.934294, -65.428322, 10.6496277]",
  "created_at": {
    "$numberLong": "1633049378000"
  }
}
```
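A minimal parsing sketch for these fields, assuming the bounding box is ordered `[min_lon, min_lat, max_lon, max_lat]` and the timestamp is in epoch milliseconds, as the example suggests:
```python
import json
from datetime import datetime, timezone

record = {
    "text": "sample text",
    "geo_geo_bbox": "[-67.220209, 9.934294, -65.428322, 10.6496277]",
    "created_at": {"$numberLong": "1633049378000"},
}

# Assumed bbox order: [min_lon, min_lat, max_lon, max_lat].
min_lon, min_lat, max_lon, max_lat = json.loads(record["geo_geo_bbox"])
center = ((min_lat + max_lat) / 2, (min_lon + max_lon) / 2)

# Assumed millisecond-resolution Unix epoch timestamp.
created = datetime.fromtimestamp(
    int(record["created_at"]["$numberLong"]) / 1000, tz=timezone.utc
)
print(center, created.isoformat())
```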
### Data Splits
This dataset is not pre-partitioned into training, validation, and test data splits, providing flexibility for users to split the data according to their specific research or application needs. Users can customize the data partitioning to suit their machine learning experiments and analytical requirements.
## Dataset Creation
### Curation Rationale
The "Seasons" dataset was created with an objective to advancing research in NLP by investigating the intricate relationships between temporal factors, content, and author location in social media posts. This dataset was assembled to provide a resource for understanding how time zones and seasonal events influence the model's results.
### Source Data
#### Initial Data Collection and Normalization
The initial data collection process focused on gathering geotagged comments from social media platforms, with a primary emphasis on Twitter.
#### Who are the source language producers?
Twitter Community
### Annotations
#### Annotation process
The coordinates in this dataset have been derived from metadata sources.
#### Who are the annotators?
No manual annotation was conducted for this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
The "Seasons" dataset has a potential to enhance our understanding of the intricate relationship between temporal dynamics, content, and location in social media posts.
### Discussion of Biases
It's essential to acknowledge that the data collected from social media platforms may contain inherent biases, influenced by user demographics and platform dynamics. Researchers should be mindful of these biases and consider potential implications in their analyses.
### Other Known Limitations
- The dataset's multilingual nature may lead to varying data quality and linguistic diversity across regions.
- The use of geotagged social media comments means that the dataset may not cover less active or less represented regions/seasons.
- The accuracy of geocoordinates is subject to inherent limitations of the data sources used for collection.
## Additional Information
### Dataset Curators
Yachay AI
### Licensing Information
MIT |
Henil1/sanskrit | 2023-09-14T15:32:13.000Z | [
"region:us"
] | Henil1 | null | null | null | 0 | 0 | Entry not found |
yachay/text_coordinates_regions | 2023-09-21T16:19:16.000Z | [
"task_categories:feature-extraction",
"task_categories:token-classification",
"task_categories:text-classification",
"size_categories:100M<n<1B",
"language:en",
"language:zh",
"language:es",
"language:hi",
"language:ar",
"language:bn",
"language:pt",
"language:ru",
"language:ja",
"language... | yachay | null | null | null | 1 | 0 | ---
license: mit
tags:
- multilingual
- text
- coordinates
- geospatial
- translation
- NER
- geo
- geo-tagged
- named-entity-recognition
- natural-language-processing
- geographic-data
- geolocation
- twitter
- reddit
task_categories:
- feature-extraction
- token-classification
- text-classification
pretty_name: Multilingual Geo-Tagged Social Media Posts (by 123 world regions)
language:
- en
- zh
- es
- hi
- ar
- bn
- pt
- ru
- ja
- pa
- de
- jv
- ms
- te
- vi
- ko
- fr
- mr
- ta
- ur
- tr
- it
- th
- gu
- fa
- pl
size_categories:
- 100M<n<1B
---
# Dataset Card for Multilingual Geo-Tagged Social Media Posts (by 123 world regions)
## Dataset Description
- **Homepage:** https://huggingface.co/datasets/yachay/text_coordinates_regions
- **Repository:** https://github.com/Yachay-AI/byt5-geotagging#datasets
- **Paper:** https://dev.to/yachayai/applying-machine-learning-to-geolocate-twitter-posts-2m1d
- **Leaderboard:**
- **Point of Contact:** admin-team@yachay.ai
### Dataset Summary
The "Regions" dataset is a multilingual corpus that encompasses textual data from the 123 most populated regions worldwide, with each region's data organized into separate .json files. This dataset consists of approximately 500,000 text samples, each paired with its geographic coordinates.
**Key Features:**
- **Textual Data:** The dataset contains 500,000 text samples.
- **Geocoordinates:** Each text sample is associated with geocoordinates.
- **Regional Coverage:** The dataset encompasses 123 of the world's most populated regions.
- **Tweet Data:** Within each region, there are 5,000 individual tweets/comments.
### Supported Tasks and Leaderboards
This dataset is well-suited for tasks such as geotagging, where the objective is to associate text with specific geographical locations. It can also be utilized for geolocation analysis, sentiment analysis in regional contexts, and regional text classification.
### Languages
**Multilingual Dataset**
This dataset is multilingual and contains text data in various languages from around the world. It does not have a fixed set of languages, and the language composition may vary across different versions or updates of the dataset.
## Dataset Structure
**Structure and Naming Convention:**
The naming convention for the JSON files follows the format "c_0.json" to "c_122.json," where "c_" represents the region category followed by a unique identifier.
```bash
/
├── .gitattributes
├── README.md
├── c_0.json # Each .json file attributes to one of 123 regions
├── c_1.json
├── ...
├── c_122.json
```
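One way to fetch and read a single region file is sketched below; that each `c_<i>.json` holds a JSON array of records shaped like the Data Fields example further down is an assumption.
```python
import json

from huggingface_hub import hf_hub_download

# Download one region file from the dataset repository.
path = hf_hub_download(
    repo_id="yachay/text_coordinates_regions",
    filename="c_0.json",
    repo_type="dataset",
)

with open(path, encoding="utf-8") as f:
    region_0 = json.load(f)  # assumed: a list of {"text", "coordinates"} records

print(len(region_0), region_0[0])
```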
### Data Instances
The Regions dataset consists of a total of 500,000 data instances, with each instance comprising a text sample and its associated geocoordinates. These instances are distributed across the 123 regions, with one .json file per region.
### Data Fields
**Text (text):** This field contains the text sample: natural language text such as comments, tweets, or other text-based content.
**Coordinates (coordinates):** This field includes geographical coordinates, latitude and longitude, providing the geographic location associated with the text.
```json
{
"text": "sample text",
"coordinates": [
"-75.04057630341867",
"40.01714225600481"
]
}
```
### Data Splits
This dataset is not pre-partitioned into training, validation, and test data splits, providing flexibility for users to split the data according to their specific research or application needs. Users can customize the data partitioning to suit their machine learning experiments and analytical requirements.
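For example, a simple random 80/10/10 partition (the ratio is arbitrary) could look like this:
```python
import random

# Toy records shaped like the Data Fields example above.
data = [
    {"text": f"sample {i}", "coordinates": ["-75.04", "40.01"]}
    for i in range(100)
]

random.seed(0)
random.shuffle(data)

n = len(data)
train = data[: int(0.8 * n)]
validation = data[int(0.8 * n) : int(0.9 * n)]
test = data[int(0.9 * n) :]
print(len(train), len(validation), len(test))
```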
## Dataset Creation
The dataset was created in 2021.
### Curation Rationale
The "Regions" dataset was created with an objective to train and enhance geotagging textual models. With 500,000 text samples, each paired with geocoordinates, it offers a resource for developing models that can associate text with specific geographical locations. Whether for geolocation analysis or other tasks merging text and geographic information, this dataset serves as a valuable training tool.
### Source Data
#### Initial Data Collection and Normalization
The initial data collection process focused on gathering geotagged comments from social media platforms, with a primary emphasis on Twitter.
#### Who are the source language producers?
Twitter Community
### Annotations
#### Annotation process
The coordinates in this dataset have been derived from metadata sources.
#### Who are the annotators?
No manual annotation was conducted for this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
The "Regions" dataset, with its multilingual text and geographic coordinates, presents an opportunity to advance research in geospatial NLP. However, it is crucial for users to exercise caution and ethical responsibility when handling location-related data to mitigate any potential privacy concerns or misuse.
### Discussion of Biases
It's essential to acknowledge that the data collected from social media platforms may contain inherent biases, influenced by user demographics and platform dynamics. Researchers should be mindful of these biases and consider potential implications in their analyses.
### Other Known Limitations
- The dataset's multilingual nature may lead to varying data quality and linguistic diversity across regions.
- The use of geotagged social media comments means that the dataset may not cover less active or less represented regions.
- The accuracy of geocoordinates is subject to inherent limitations of the data sources used for collection.
## Additional Information
### Dataset Curators
Yachay AI
### Licensing Information
MIT |
CyberHarem/yusa_kozue_idolmastercinderellagirls | 2023-09-17T17:37:32.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yusa_kozue (THE iDOLM@STER: Cinderella Girls)
This is the dataset of yusa_kozue (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 512 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 512 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 512 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 512 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/himekawa_yuki_idolmastercinderellagirls | 2023-09-17T17:37:34.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of himekawa_yuki (THE iDOLM@STER: Cinderella Girls)
This is the dataset of himekawa_yuki (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 556 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 556 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 556 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 556 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
Dmente/Purga | 2023-09-14T15:52:49.000Z | [
"region:us"
] | Dmente | null | null | null | 0 | 0 | Entry not found |
CyberHarem/fujiwara_hajime_idolmastercinderellagirls | 2023-09-17T17:37:36.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of fujiwara_hajime (THE iDOLM@STER: Cinderella Girls)
This is the dataset of fujiwara_hajime (THE iDOLM@STER: Cinderella Girls), containing 38 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 38 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 90 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 38 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 38 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 38 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 38 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 38 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 90 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 90 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 90 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
dubinthebox/movies | 2023-09-14T16:26:33.000Z | [
"license:mit",
"region:us"
] | dubinthebox | null | null | null | 0 | 0 | ---
license: mit
---
|
VictorGil75/taller | 2023-09-14T16:34:41.000Z | [
"region:us"
] | VictorGil75 | null | null | null | 0 | 0 | Entry not found |
MingLiiii/cherry_wizardlm_filtered | 2023-09-14T17:20:44.000Z | [
"region:us"
] | MingLiiii | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: data
struct:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 118756239
num_examples: 63655
download_size: 62560682
dataset_size: 118756239
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cherry_wizardlm_filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/ryuuzaki_kaoru_idolmastercinderellagirls | 2023-09-17T17:37:38.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ryuuzaki_kaoru (THE iDOLM@STER: Cinderella Girls)
This is the dataset of ryuuzaki_kaoru (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 537 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 537 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 537 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 537 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
ZacharySkanHF/Ian-Mccollum-AI-250-Epochs | 2023-09-15T01:50:35.000Z | [
"region:us"
] | ZacharySkanHF | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_sauce1337__BerrySauce-L2-13b | 2023-09-14T17:31:16.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of sauce1337/BerrySauce-L2-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [sauce1337/BerrySauce-L2-13b](https://huggingface.co/sauce1337/BerrySauce-L2-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sauce1337__BerrySauce-L2-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-14T17:29:56.581892](https://huggingface.co/datasets/open-llm-leaderboard/details_sauce1337__BerrySauce-L2-13b/blob/main/results_2023-09-14T17-29-56.581892.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5726196706877625,\n\
\ \"acc_stderr\": 0.03429118889656096,\n \"acc_norm\": 0.5764359267876852,\n\
\ \"acc_norm_stderr\": 0.034269405377698306,\n \"mc1\": 0.3427172582619339,\n\
\ \"mc1_stderr\": 0.016614949385347036,\n \"mc2\": 0.48300146678836864,\n\
\ \"mc2_stderr\": 0.015515296488170974\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5972696245733788,\n \"acc_stderr\": 0.014332236306790149,\n\
\ \"acc_norm\": 0.6228668941979523,\n \"acc_norm_stderr\": 0.0141633668961926\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6382194781915953,\n\
\ \"acc_stderr\": 0.004795337009118202,\n \"acc_norm\": 0.8377813184624576,\n\
\ \"acc_norm_stderr\": 0.003678978806819641\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04046336883978251,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04046336883978251\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364396,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364396\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.02422996529842507,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.02422996529842507\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n\
\ \"acc_stderr\": 0.02659308451657226,\n \"acc_norm\": 0.6774193548387096,\n\
\ \"acc_norm_stderr\": 0.02659308451657226\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.036462049632538115,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.036462049632538115\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139744,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139744\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5384615384615384,\n \"acc_stderr\": 0.025275892070240637,\n\
\ \"acc_norm\": 0.5384615384615384,\n \"acc_norm_stderr\": 0.025275892070240637\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.032219436365661956,\n\
\ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.032219436365661956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7577981651376147,\n \"acc_stderr\": 0.01836817630659862,\n \"\
acc_norm\": 0.7577981651376147,\n \"acc_norm_stderr\": 0.01836817630659862\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.782051282051282,\n\
\ \"acc_stderr\": 0.027046857630716684,\n \"acc_norm\": 0.782051282051282,\n\
\ \"acc_norm_stderr\": 0.027046857630716684\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7611749680715197,\n\
\ \"acc_stderr\": 0.015246803197398674,\n \"acc_norm\": 0.7611749680715197,\n\
\ \"acc_norm_stderr\": 0.015246803197398674\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.025574123786546665,\n\
\ \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.025574123786546665\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43910614525139663,\n\
\ \"acc_stderr\": 0.016598022120580425,\n \"acc_norm\": 0.43910614525139663,\n\
\ \"acc_norm_stderr\": 0.016598022120580425\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02782610930728369,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02782610930728369\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n\
\ \"acc_stderr\": 0.027210420375934023,\n \"acc_norm\": 0.6430868167202572,\n\
\ \"acc_norm_stderr\": 0.027210420375934023\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.02646248777700187,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.02646248777700187\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.029525914302558555,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.029525914302558555\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.424380704041721,\n\
\ \"acc_stderr\": 0.012623343757430018,\n \"acc_norm\": 0.424380704041721,\n\
\ \"acc_norm_stderr\": 0.012623343757430018\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5845588235294118,\n \"acc_stderr\": 0.029935342707877753,\n\
\ \"acc_norm\": 0.5845588235294118,\n \"acc_norm_stderr\": 0.029935342707877753\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5784313725490197,\n \"acc_stderr\": 0.019977422600227474,\n \
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.019977422600227474\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.7611940298507462,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3427172582619339,\n\
\ \"mc1_stderr\": 0.016614949385347036,\n \"mc2\": 0.48300146678836864,\n\
\ \"mc2_stderr\": 0.015515296488170974\n }\n}\n```"
repo_url: https://huggingface.co/sauce1337/BerrySauce-L2-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|arc:challenge|25_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hellaswag|10_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T17-29-56.581892.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T17-29-56.581892.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T17-29-56.581892.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T17-29-56.581892.parquet'
- config_name: results
data_files:
- split: 2023_09_14T17_29_56.581892
path:
- results_2023-09-14T17-29-56.581892.parquet
- split: latest
path:
- results_2023-09-14T17-29-56.581892.parquet
---
# Dataset Card for Evaluation run of sauce1337/BerrySauce-L2-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/sauce1337/BerrySauce-L2-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [sauce1337/BerrySauce-L2-13b](https://huggingface.co/sauce1337/BerrySauce-L2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sauce1337__BerrySauce-L2-13b",
"harness_truthfulqa_mc_0",
split="train")
```
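The per-split layout shown in the YAML above also applies to the aggregated metrics; a minimal sketch (assuming only the `results` config and `latest` split names declared in this card) for loading the newest aggregate scores:
```python
from datasets import load_dataset

# "latest" always points at the parquet file of the most recent run;
# timestamped splits (e.g. 2023_09_14T17_29_56.581892) pin a specific run.
results = load_dataset(
    "open-llm-leaderboard/details_sauce1337__BerrySauce-L2-13b",
    "results",
    split="latest",
)
```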
## Latest results
These are the [latest results from run 2023-09-14T17:29:56.581892](https://huggingface.co/datasets/open-llm-leaderboard/details_sauce1337__BerrySauce-L2-13b/blob/main/results_2023-09-14T17-29-56.581892.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5726196706877625,
"acc_stderr": 0.03429118889656096,
"acc_norm": 0.5764359267876852,
"acc_norm_stderr": 0.034269405377698306,
"mc1": 0.3427172582619339,
"mc1_stderr": 0.016614949385347036,
"mc2": 0.48300146678836864,
"mc2_stderr": 0.015515296488170974
},
"harness|arc:challenge|25": {
"acc": 0.5972696245733788,
"acc_stderr": 0.014332236306790149,
"acc_norm": 0.6228668941979523,
"acc_norm_stderr": 0.0141633668961926
},
"harness|hellaswag|10": {
"acc": 0.6382194781915953,
"acc_stderr": 0.004795337009118202,
"acc_norm": 0.8377813184624576,
"acc_norm_stderr": 0.003678978806819641
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364396,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364396
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.02422996529842507,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.02422996529842507
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.02659308451657226,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.02659308451657226
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.036462049632538115,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.036462049632538115
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.02649905770139744,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.02649905770139744
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5384615384615384,
"acc_stderr": 0.025275892070240637,
"acc_norm": 0.5384615384615384,
"acc_norm_stderr": 0.025275892070240637
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.032219436365661956,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.032219436365661956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7577981651376147,
"acc_stderr": 0.01836817630659862,
"acc_norm": 0.7577981651376147,
"acc_norm_stderr": 0.01836817630659862
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908706,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908706
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.782051282051282,
"acc_stderr": 0.027046857630716684,
"acc_norm": 0.782051282051282,
"acc_norm_stderr": 0.027046857630716684
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7611749680715197,
"acc_stderr": 0.015246803197398674,
"acc_norm": 0.7611749680715197,
"acc_norm_stderr": 0.015246803197398674
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.025574123786546665,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.025574123786546665
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43910614525139663,
"acc_stderr": 0.016598022120580425,
"acc_norm": 0.43910614525139663,
"acc_norm_stderr": 0.016598022120580425
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02782610930728369,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02782610930728369
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.027210420375934023,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.027210420375934023
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.02646248777700187,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.02646248777700187
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.029525914302558555,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.029525914302558555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.424380704041721,
"acc_stderr": 0.012623343757430018,
"acc_norm": 0.424380704041721,
"acc_norm_stderr": 0.012623343757430018
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5845588235294118,
"acc_stderr": 0.029935342707877753,
"acc_norm": 0.5845588235294118,
"acc_norm_stderr": 0.029935342707877753
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.019977422600227474,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.019977422600227474
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3427172582619339,
"mc1_stderr": 0.016614949385347036,
"mc2": 0.48300146678836864,
"mc2_stderr": 0.015515296488170974
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
hfllm/tech | 2023-09-14T17:36:40.000Z | [
"region:us"
] | hfllm | null | null | null | 0 | 0 | Entry not found |
patrickvonplaten/muse_images | 2023-09-14T17:51:53.000Z | [
"region:us"
] | patrickvonplaten | null | null | null | 0 | 0 | Entry not found |
alpayariyak/corpus_1_embedded_deduplicated | 2023-09-14T17:47:55.000Z | [
"region:us"
] | alpayariyak | null | null | null | 0 | 0 | Entry not found |
victor/outicon | 2023-09-14T17:55:37.000Z | [
"region:us"
] | victor | null | null | null | 0 | 0 | Entry not found |
CyberHarem/sasaki_chie_idolmastercinderellagirls | 2023-09-17T17:37:40.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sasaki_chie (THE iDOLM@STER: Cinderella Girls)
This is the dataset of sasaki_chie (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 535 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 535 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 535 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 535 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
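The packaged variants above can also be fetched programmatically; a minimal sketch using `huggingface_hub`, assuming the zip filenames listed in the Download column:
```python
from huggingface_hub import hf_hub_download

# Fetch one packaged variant (here the 384x512 aligned dataset) from this
# dataset repository; swap the filename for any other row of the table.
path = hf_hub_download(
    repo_id="CyberHarem/sasaki_chie_idolmastercinderellagirls",
    filename="dataset-384x512.zip",
    repo_type="dataset",
)
print(path)  # local cache path of the downloaded zip
```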
|
Pillonneau/test | 2023-09-14T18:01:31.000Z | [
"license:apache-2.0",
"region:us"
] | Pillonneau | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
CyberHarem/shiina_noriko_idolmastercinderellagirls | 2023-09-17T17:37:42.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shiina_noriko (THE iDOLM@STER: Cinderella Girls)
This is the dataset of shiina_noriko (THE iDOLM@STER: Cinderella Girls), containing 111 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 111 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 293 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 111 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 111 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 111 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 111 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 111 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 293 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 293 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 293 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
TuningAI/Cover_letter_v1 | 2023-09-14T18:06:47.000Z | [
"region:us"
] | TuningAI | null | null | null | 0 | 0 | Entry not found |
TuningAI/Startups_V1 | 2023-09-15T13:56:28.000Z | [
"region:us"
] | TuningAI | null | null | null | 0 | 0 | Entry not found |
icantiemyshoe/shodan-io-rag-dpr | 2023-09-14T18:07:35.000Z | [
"license:bsd-3-clause",
"region:us"
] | icantiemyshoe | null | null | null | 0 | 0 | ---
license: bsd-3-clause
---
|
Transzvuk/piling-equipment | 2023-10-03T20:09:52.000Z | [
"doi:10.57967/hf/1180",
"region:us"
] | Transzvuk | null | null | null | 0 | 0 | ---
pretty_name: piles
license: openrail
task_categories:
- image-segmentation
- object-detection
language:
- en
- ru
size_categories:
- n<1K
--- |
approach0/annotated-topics-perfect | 2023-09-14T18:28:24.000Z | [
"region:us"
] | approach0 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: problem
dtype: string
- name: query
dtype: string
- name: prompt
dtype: string
- name: solution
dtype: string
- name: ground_truth
dtype: 'null'
- name: judge_buffer
dtype: 'null'
- name: manual_query
dtype: 'null'
- name: manual_rating
dtype: int64
- name: args
dtype: string
- name: out_str
dtype: string
- name: tool_res
sequence: string
splits:
- name: test
num_bytes: 73300
num_examples: 9
download_size: 56937
dataset_size: 73300
---
# Dataset Card for "annotated-topic-perfect"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
approach0/annotated-topics-good | 2023-09-14T18:29:16.000Z | [
"region:us"
] | approach0 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: problem
dtype: string
- name: query
dtype: string
- name: prompt
dtype: string
- name: solution
dtype: string
- name: ground_truth
dtype: 'null'
- name: judge_buffer
dtype: 'null'
- name: manual_query
dtype: 'null'
- name: manual_rating
dtype: int64
- name: args
dtype: string
- name: out_str
dtype: string
- name: tool_res
sequence: string
splits:
- name: test
num_bytes: 289031
num_examples: 41
download_size: 101510
dataset_size: 289031
---
# Dataset Card for "annotated-topic-good"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
approach0/no-asy-precalculus-topics-by-queryLM | 2023-09-14T18:33:42.000Z | [
"region:us"
] | approach0 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: src_path
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
- name: out_str
dtype: string
- name: tool_res
sequence: string
splits:
- name: test
num_bytes: 2199344
num_examples: 392
download_size: 675379
dataset_size: 2199344
---
# Dataset Card for "no-asy-precalculus-topics"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/miyoshi_sana_idolmastercinderellagirls | 2023-09-17T17:37:44.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of miyoshi_sana (THE iDOLM@STER: Cinderella Girls)
This is the dataset of miyoshi_sana (THE iDOLM@STER: Cinderella Girls), containing 90 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 90 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 243 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 90 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 90 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 90 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 90 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 90 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 243 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 243 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 243 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-7B | 2023-09-14T18:48:05.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ajibawa-2023/Uncensored-Frank-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ajibawa-2023/Uncensored-Frank-7B](https://huggingface.co/ajibawa-2023/Uncensored-Frank-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-14T18:46:51.372002](https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-7B/blob/main/results_2023-09-14T18-46-51.372002.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3806262300772237,\n\
\ \"acc_stderr\": 0.03465067219069186,\n \"acc_norm\": 0.38446769124315544,\n\
\ \"acc_norm_stderr\": 0.03463804114424542,\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.4385958916369525,\n\
\ \"mc2_stderr\": 0.015588485121300084\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4991467576791809,\n \"acc_stderr\": 0.014611369529813276,\n\
\ \"acc_norm\": 0.5426621160409556,\n \"acc_norm_stderr\": 0.014558106543924068\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.582055367456682,\n\
\ \"acc_stderr\": 0.0049221295689195815,\n \"acc_norm\": 0.7651862178848835,\n\
\ \"acc_norm_stderr\": 0.004230160814469385\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40131578947368424,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.40131578947368424,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4226415094339623,\n \"acc_stderr\": 0.030402331445769537,\n\
\ \"acc_norm\": 0.4226415094339623,\n \"acc_norm_stderr\": 0.030402331445769537\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n\
\ \"acc_stderr\": 0.040329990539607195,\n \"acc_norm\": 0.3680555555555556,\n\
\ \"acc_norm_stderr\": 0.040329990539607195\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.30057803468208094,\n\
\ \"acc_stderr\": 0.03496101481191181,\n \"acc_norm\": 0.30057803468208094,\n\
\ \"acc_norm_stderr\": 0.03496101481191181\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309993,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309993\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948365,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948365\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3580645161290323,\n\
\ \"acc_stderr\": 0.027273890594300645,\n \"acc_norm\": 0.3580645161290323,\n\
\ \"acc_norm_stderr\": 0.027273890594300645\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694433,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694433\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4727272727272727,\n \"acc_stderr\": 0.03898531605579419,\n\
\ \"acc_norm\": 0.4727272727272727,\n \"acc_norm_stderr\": 0.03898531605579419\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.41919191919191917,\n \"acc_stderr\": 0.035155207286704175,\n \"\
acc_norm\": 0.41919191919191917,\n \"acc_norm_stderr\": 0.035155207286704175\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.46632124352331605,\n \"acc_stderr\": 0.03600244069867178,\n\
\ \"acc_norm\": 0.46632124352331605,\n \"acc_norm_stderr\": 0.03600244069867178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3153846153846154,\n \"acc_stderr\": 0.02355964698318995,\n \
\ \"acc_norm\": 0.3153846153846154,\n \"acc_norm_stderr\": 0.02355964698318995\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895992,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895992\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31932773109243695,\n \"acc_stderr\": 0.030283995525884396,\n\
\ \"acc_norm\": 0.31932773109243695,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23178807947019867,\n \"acc_stderr\": 0.03445406271987053,\n \"\
acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.03445406271987053\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.44770642201834865,\n \"acc_stderr\": 0.021319754962425455,\n \"\
acc_norm\": 0.44770642201834865,\n \"acc_norm_stderr\": 0.021319754962425455\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.19907407407407407,\n \"acc_stderr\": 0.02723229846269023,\n \"\
acc_norm\": 0.19907407407407407,\n \"acc_norm_stderr\": 0.02723229846269023\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5098039215686274,\n \"acc_stderr\": 0.03508637358630573,\n \"\
acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.03508637358630573\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4641350210970464,\n \"acc_stderr\": 0.03246338898055659,\n \
\ \"acc_norm\": 0.4641350210970464,\n \"acc_norm_stderr\": 0.03246338898055659\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.484304932735426,\n\
\ \"acc_stderr\": 0.0335412657542081,\n \"acc_norm\": 0.484304932735426,\n\
\ \"acc_norm_stderr\": 0.0335412657542081\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.35877862595419846,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.35877862595419846,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5785123966942148,\n \"acc_stderr\": 0.04507732278775087,\n \"\
acc_norm\": 0.5785123966942148,\n \"acc_norm_stderr\": 0.04507732278775087\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.047128212574267705,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.047128212574267705\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3496932515337423,\n \"acc_stderr\": 0.037466683254700206,\n\
\ \"acc_norm\": 0.3496932515337423,\n \"acc_norm_stderr\": 0.037466683254700206\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952686,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952686\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.36893203883495146,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.36893203883495146,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5598290598290598,\n\
\ \"acc_stderr\": 0.0325207417206305,\n \"acc_norm\": 0.5598290598290598,\n\
\ \"acc_norm_stderr\": 0.0325207417206305\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.01776925058353325,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.01776925058353325\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.41040462427745666,\n \"acc_stderr\": 0.02648339204209818,\n\
\ \"acc_norm\": 0.41040462427745666,\n \"acc_norm_stderr\": 0.02648339204209818\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23687150837988827,\n\
\ \"acc_stderr\": 0.014219570788103984,\n \"acc_norm\": 0.23687150837988827,\n\
\ \"acc_norm_stderr\": 0.014219570788103984\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3758169934640523,\n \"acc_stderr\": 0.02773283435336394,\n\
\ \"acc_norm\": 0.3758169934640523,\n \"acc_norm_stderr\": 0.02773283435336394\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.36012861736334406,\n\
\ \"acc_stderr\": 0.02726429759980401,\n \"acc_norm\": 0.36012861736334406,\n\
\ \"acc_norm_stderr\": 0.02726429759980401\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.027513747284379424,\n\
\ \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.027513747284379424\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.30141843971631205,\n \"acc_stderr\": 0.02737412888263115,\n \
\ \"acc_norm\": 0.30141843971631205,\n \"acc_norm_stderr\": 0.02737412888263115\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.29465449804432853,\n\
\ \"acc_stderr\": 0.011643576764069553,\n \"acc_norm\": 0.29465449804432853,\n\
\ \"acc_norm_stderr\": 0.011643576764069553\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.03034326422421352,\n\
\ \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.03034326422421352\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.38235294117647056,\n \"acc_stderr\": 0.019659922493623336,\n \
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.019659922493623336\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.45454545454545453,\n\
\ \"acc_stderr\": 0.04769300568972743,\n \"acc_norm\": 0.45454545454545453,\n\
\ \"acc_norm_stderr\": 0.04769300568972743\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.31020408163265306,\n \"acc_stderr\": 0.029613459872484378,\n\
\ \"acc_norm\": 0.31020408163265306,\n \"acc_norm_stderr\": 0.029613459872484378\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.48756218905472637,\n\
\ \"acc_stderr\": 0.03534439848539579,\n \"acc_norm\": 0.48756218905472637,\n\
\ \"acc_norm_stderr\": 0.03534439848539579\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.03834234744164993,\n\
\ \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.03834234744164993\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.4385958916369525,\n\
\ \"mc2_stderr\": 0.015588485121300084\n }\n}\n```"
repo_url: https://huggingface.co/ajibawa-2023/Uncensored-Frank-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|arc:challenge|25_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hellaswag|10_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T18-46-51.372002.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T18-46-51.372002.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T18-46-51.372002.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T18-46-51.372002.parquet'
- config_name: results
data_files:
- split: 2023_09_14T18_46_51.372002
path:
- results_2023-09-14T18-46-51.372002.parquet
- split: latest
path:
- results_2023-09-14T18-46-51.372002.parquet
---
# Dataset Card for Evaluation run of ajibawa-2023/Uncensored-Frank-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ajibawa-2023/Uncensored-Frank-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ajibawa-2023/Uncensored-Frank-7B](https://huggingface.co/ajibawa-2023/Uncensored-Frank-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-7B",
"harness_truthfulqa_mc_0",
split="train")
```
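The aggregated scores can be loaded the same way. As a minimal sketch (assuming the `results` configuration and `latest` split declared in the YAML header above):
```python
from datasets import load_dataset

# "latest" always points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-7B",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated metrics of the run
```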
## Latest results
These are the [latest results from run 2023-09-14T18:46:51.372002](https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-7B/blob/main/results_2023-09-14T18-46-51.372002.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3806262300772237,
"acc_stderr": 0.03465067219069186,
"acc_norm": 0.38446769124315544,
"acc_norm_stderr": 0.03463804114424542,
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236614,
"mc2": 0.4385958916369525,
"mc2_stderr": 0.015588485121300084
},
"harness|arc:challenge|25": {
"acc": 0.4991467576791809,
"acc_stderr": 0.014611369529813276,
"acc_norm": 0.5426621160409556,
"acc_norm_stderr": 0.014558106543924068
},
"harness|hellaswag|10": {
"acc": 0.582055367456682,
"acc_stderr": 0.0049221295689195815,
"acc_norm": 0.7651862178848835,
"acc_norm_stderr": 0.004230160814469385
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40131578947368424,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.40131578947368424,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4226415094339623,
"acc_stderr": 0.030402331445769537,
"acc_norm": 0.4226415094339623,
"acc_norm_stderr": 0.030402331445769537
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3680555555555556,
"acc_stderr": 0.040329990539607195,
"acc_norm": 0.3680555555555556,
"acc_norm_stderr": 0.040329990539607195
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.30057803468208094,
"acc_stderr": 0.03496101481191181,
"acc_norm": 0.30057803468208094,
"acc_norm_stderr": 0.03496101481191181
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948365,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948365
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235173,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3580645161290323,
"acc_stderr": 0.027273890594300645,
"acc_norm": 0.3580645161290323,
"acc_norm_stderr": 0.027273890594300645
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694433,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694433
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.03898531605579419,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.03898531605579419
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.41919191919191917,
"acc_stderr": 0.035155207286704175,
"acc_norm": 0.41919191919191917,
"acc_norm_stderr": 0.035155207286704175
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.46632124352331605,
"acc_stderr": 0.03600244069867178,
"acc_norm": 0.46632124352331605,
"acc_norm_stderr": 0.03600244069867178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3153846153846154,
"acc_stderr": 0.02355964698318995,
"acc_norm": 0.3153846153846154,
"acc_norm_stderr": 0.02355964698318995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895992,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895992
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31932773109243695,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.31932773109243695,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.03445406271987053,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.03445406271987053
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.44770642201834865,
"acc_stderr": 0.021319754962425455,
"acc_norm": 0.44770642201834865,
"acc_norm_stderr": 0.021319754962425455
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.19907407407407407,
"acc_stderr": 0.02723229846269023,
"acc_norm": 0.19907407407407407,
"acc_norm_stderr": 0.02723229846269023
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.03508637358630573,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.03508637358630573
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4641350210970464,
"acc_stderr": 0.03246338898055659,
"acc_norm": 0.4641350210970464,
"acc_norm_stderr": 0.03246338898055659
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.484304932735426,
"acc_stderr": 0.0335412657542081,
"acc_norm": 0.484304932735426,
"acc_norm_stderr": 0.0335412657542081
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.35877862595419846,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.35877862595419846,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5785123966942148,
"acc_stderr": 0.04507732278775087,
"acc_norm": 0.5785123966942148,
"acc_norm_stderr": 0.04507732278775087
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.047128212574267705,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.047128212574267705
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3496932515337423,
"acc_stderr": 0.037466683254700206,
"acc_norm": 0.3496932515337423,
"acc_norm_stderr": 0.037466683254700206
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952686,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952686
},
"harness|hendrycksTest-management|5": {
"acc": 0.36893203883495146,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.36893203883495146,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5598290598290598,
"acc_stderr": 0.0325207417206305,
"acc_norm": 0.5598290598290598,
"acc_norm_stderr": 0.0325207417206305
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.01776925058353325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.01776925058353325
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.02648339204209818,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.02648339204209818
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23687150837988827,
"acc_stderr": 0.014219570788103984,
"acc_norm": 0.23687150837988827,
"acc_norm_stderr": 0.014219570788103984
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3758169934640523,
"acc_stderr": 0.02773283435336394,
"acc_norm": 0.3758169934640523,
"acc_norm_stderr": 0.02773283435336394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.36012861736334406,
"acc_stderr": 0.02726429759980401,
"acc_norm": 0.36012861736334406,
"acc_norm_stderr": 0.02726429759980401
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.027513747284379424,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.027513747284379424
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.30141843971631205,
"acc_stderr": 0.02737412888263115,
"acc_norm": 0.30141843971631205,
"acc_norm_stderr": 0.02737412888263115
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.29465449804432853,
"acc_stderr": 0.011643576764069553,
"acc_norm": 0.29465449804432853,
"acc_norm_stderr": 0.011643576764069553
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47794117647058826,
"acc_stderr": 0.03034326422421352,
"acc_norm": 0.47794117647058826,
"acc_norm_stderr": 0.03034326422421352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.019659922493623336,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.019659922493623336
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.04769300568972743,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.04769300568972743
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.31020408163265306,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.31020408163265306,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.48756218905472637,
"acc_stderr": 0.03534439848539579,
"acc_norm": 0.48756218905472637,
"acc_norm_stderr": 0.03534439848539579
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.03834234744164993,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.03834234744164993
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236614,
"mc2": 0.4385958916369525,
"mc2_stderr": 0.015588485121300084
}
}
```
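As a quick sanity check, the per-task scores above can be aggregated by hand. A minimal sketch, assuming `results` is a Python dict holding the JSON shown above:
```python
# `results` is assumed to be the dict printed above.
mmlu = {name: scores for name, scores in results.items()
        if name.startswith("harness|hendrycksTest-")}
avg_acc_norm = sum(scores["acc_norm"] for scores in mmlu.values()) / len(mmlu)
print(f"MMLU acc_norm averaged over {len(mmlu)} tasks: {avg_acc_norm:.4f}")
```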
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/fujimoto_rina_idolmastercinderellagirls | 2023-09-17T17:37:46.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of fujimoto_rina (THE iDOLM@STER: Cinderella Girls)
This is the dataset of fujimoto_rina (THE iDOLM@STER: Cinderella Girls), containing 88 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 88 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 236 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 88 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 88 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 88 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 88 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 88 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 236 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 236 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 236 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
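To fetch one of the archives programmatically, here is a minimal sketch using `huggingface_hub` (the repo id and filename come from this card; the extraction directory is a hypothetical choice):
```python
import zipfile
from huggingface_hub import hf_hub_download

# Download the 512x512 aligned archive from this dataset repository.
path = hf_hub_download(
    repo_id="CyberHarem/fujimoto_rina_idolmastercinderellagirls",
    filename="dataset-512x512.zip",
    repo_type="dataset",
)
with zipfile.ZipFile(path) as zf:
    zf.extractall("fujimoto_rina_512x512")  # hypothetical output directory
```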
|
open-llm-leaderboard/details_aqweteddy__llama_chat-tv_en_luban-tv_stable_platypus2 | 2023-09-14T19:16:11.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of aqweteddy/llama_chat-tv_en_luban-tv_stable_platypus2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [aqweteddy/llama_chat-tv_en_luban-tv_stable_platypus2](https://huggingface.co/aqweteddy/llama_chat-tv_en_luban-tv_stable_platypus2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aqweteddy__llama_chat-tv_en_luban-tv_stable_platypus2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-14T19:14:45.418998](https://huggingface.co/datasets/open-llm-leaderboard/details_aqweteddy__llama_chat-tv_en_luban-tv_stable_platypus2/blob/main/results_2023-09-14T19-14-45.418998.json)\
\ (note that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4945007463965186,\n\
\ \"acc_stderr\": 0.03527731102256181,\n \"acc_norm\": 0.49711825981073476,\n\
\ \"acc_norm_stderr\": 0.035276035393914426,\n \"mc1\": 0.32558139534883723,\n\
\ \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.5188093512935639,\n\
\ \"mc2_stderr\": 0.016351300657386426\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4351535836177474,\n \"acc_stderr\": 0.014487986197186043,\n\
\ \"acc_norm\": 0.4453924914675768,\n \"acc_norm_stderr\": 0.01452398763834409\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4660426209918343,\n\
\ \"acc_stderr\": 0.004978260641742204,\n \"acc_norm\": 0.6102370045807608,\n\
\ \"acc_norm_stderr\": 0.004866997110388195\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5320754716981132,\n \"acc_stderr\": 0.030709486992556538,\n\
\ \"acc_norm\": 0.5320754716981132,\n \"acc_norm_stderr\": 0.030709486992556538\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.4797687861271676,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.032321469162244695,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.032321469162244695\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.02326651221373057,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02326651221373057\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.535483870967742,\n\
\ \"acc_stderr\": 0.028372287797962935,\n \"acc_norm\": 0.535483870967742,\n\
\ \"acc_norm_stderr\": 0.028372287797962935\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.03422398565657551,\n\
\ \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.03422398565657551\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03756335775187896,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03756335775187896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6515151515151515,\n \"acc_stderr\": 0.033948539651564025,\n \"\
acc_norm\": 0.6515151515151515,\n \"acc_norm_stderr\": 0.033948539651564025\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7461139896373057,\n \"acc_stderr\": 0.0314102478056532,\n\
\ \"acc_norm\": 0.7461139896373057,\n \"acc_norm_stderr\": 0.0314102478056532\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5153846153846153,\n \"acc_stderr\": 0.025339003010106515,\n\
\ \"acc_norm\": 0.5153846153846153,\n \"acc_norm_stderr\": 0.025339003010106515\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.02752859921034049,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.02752859921034049\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03242225027115006,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03242225027115006\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6495412844036698,\n \"acc_stderr\": 0.020456077599824467,\n \"\
acc_norm\": 0.6495412844036698,\n \"acc_norm_stderr\": 0.020456077599824467\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257013,\n \"\
acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257013\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5784313725490197,\n \"acc_stderr\": 0.03465868196380762,\n \"\
acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.03465868196380762\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6582278481012658,\n \"acc_stderr\": 0.030874537537553617,\n \
\ \"acc_norm\": 0.6582278481012658,\n \"acc_norm_stderr\": 0.030874537537553617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5739910313901345,\n\
\ \"acc_stderr\": 0.033188332862172806,\n \"acc_norm\": 0.5739910313901345,\n\
\ \"acc_norm_stderr\": 0.033188332862172806\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.0435644720266507,\n\
\ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.0435644720266507\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6033057851239669,\n \"acc_stderr\": 0.04465869780531009,\n \"\
acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.04465869780531009\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n\
\ \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.5648148148148148,\n\
\ \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5705521472392638,\n \"acc_stderr\": 0.03889066619112723,\n\
\ \"acc_norm\": 0.5705521472392638,\n \"acc_norm_stderr\": 0.03889066619112723\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.026853450377009157,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.026853450377009157\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6730523627075351,\n\
\ \"acc_stderr\": 0.016774908180131467,\n \"acc_norm\": 0.6730523627075351,\n\
\ \"acc_norm_stderr\": 0.016774908180131467\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.02690784985628254,\n\
\ \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.02690784985628254\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41564245810055866,\n\
\ \"acc_stderr\": 0.016482782187500662,\n \"acc_norm\": 0.41564245810055866,\n\
\ \"acc_norm_stderr\": 0.016482782187500662\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.028614624752805427,\n\
\ \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.028614624752805427\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5209003215434084,\n\
\ \"acc_stderr\": 0.02837327096106942,\n \"acc_norm\": 0.5209003215434084,\n\
\ \"acc_norm_stderr\": 0.02837327096106942\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.027801656212323674,\n\
\ \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.027801656212323674\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.38652482269503546,\n \"acc_stderr\": 0.029049190342543465,\n \
\ \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.029049190342543465\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3728813559322034,\n\
\ \"acc_stderr\": 0.012350630058333353,\n \"acc_norm\": 0.3728813559322034,\n\
\ \"acc_norm_stderr\": 0.012350630058333353\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.40441176470588236,\n \"acc_stderr\": 0.029812630701569743,\n\
\ \"acc_norm\": 0.40441176470588236,\n \"acc_norm_stderr\": 0.029812630701569743\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.45588235294117646,\n \"acc_stderr\": 0.020148939420415738,\n \
\ \"acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.020148939420415738\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5306122448979592,\n \"acc_stderr\": 0.031949171367580624,\n\
\ \"acc_norm\": 0.5306122448979592,\n \"acc_norm_stderr\": 0.031949171367580624\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.572139303482587,\n\
\ \"acc_stderr\": 0.03498541988407795,\n \"acc_norm\": 0.572139303482587,\n\
\ \"acc_norm_stderr\": 0.03498541988407795\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n\
\ \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n\
\ \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32558139534883723,\n\
\ \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.5188093512935639,\n\
\ \"mc2_stderr\": 0.016351300657386426\n }\n}\n```"
repo_url: https://huggingface.co/aqweteddy/llama_chat-tv_en_luban-tv_stable_platypus2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|arc:challenge|25_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hellaswag|10_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T19-14-45.418998.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T19-14-45.418998.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T19-14-45.418998.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T19-14-45.418998.parquet'
- config_name: results
data_files:
- split: 2023_09_14T19_14_45.418998
path:
- results_2023-09-14T19-14-45.418998.parquet
- split: latest
path:
- results_2023-09-14T19-14-45.418998.parquet
---
# Dataset Card for Evaluation run of aqweteddy/llama_chat-tv_en_luban-tv_stable_platypus2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/aqweteddy/llama_chat-tv_en_luban-tv_stable_platypus2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [aqweteddy/llama_chat-tv_en_luban-tv_stable_platypus2](https://huggingface.co/aqweteddy/llama_chat-tv_en_luban-tv_stable_platypus2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aqweteddy__llama_chat-tv_en_luban-tv_stable_platypus2",
"harness_truthfulqa_mc_0",
split="train")
```
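You can also point the same call at any of the per-task configs listed in the YAML above. A minimal sketch (the config and split names are taken from the file list in this card; the exact row schema is an assumption):
```python
from datasets import load_dataset

# Load the most recent run of a single MMLU subtask config defined above.
data = load_dataset(
    "open-llm-leaderboard/details_aqweteddy__llama_chat-tv_en_luban-tv_stable_platypus2",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(data[0])  # inspect the first evaluated example (fields may vary by task)
```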
## Latest results
These are the [latest results from run 2023-09-14T19:14:45.418998](https://huggingface.co/datasets/open-llm-leaderboard/details_aqweteddy__llama_chat-tv_en_luban-tv_stable_platypus2/blob/main/results_2023-09-14T19-14-45.418998.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4945007463965186,
"acc_stderr": 0.03527731102256181,
"acc_norm": 0.49711825981073476,
"acc_norm_stderr": 0.035276035393914426,
"mc1": 0.32558139534883723,
"mc1_stderr": 0.016403989469907825,
"mc2": 0.5188093512935639,
"mc2_stderr": 0.016351300657386426
},
"harness|arc:challenge|25": {
"acc": 0.4351535836177474,
"acc_stderr": 0.014487986197186043,
"acc_norm": 0.4453924914675768,
"acc_norm_stderr": 0.01452398763834409
},
"harness|hellaswag|10": {
"acc": 0.4660426209918343,
"acc_stderr": 0.004978260641742204,
"acc_norm": 0.6102370045807608,
"acc_norm_stderr": 0.004866997110388195
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5320754716981132,
"acc_stderr": 0.030709486992556538,
"acc_norm": 0.5320754716981132,
"acc_norm_stderr": 0.030709486992556538
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.032321469162244695,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.032321469162244695
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02326651221373057,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02326651221373057
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.535483870967742,
"acc_stderr": 0.028372287797962935,
"acc_norm": 0.535483870967742,
"acc_norm_stderr": 0.028372287797962935
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.03422398565657551,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.03422398565657551
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03756335775187896,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03756335775187896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6515151515151515,
"acc_stderr": 0.033948539651564025,
"acc_norm": 0.6515151515151515,
"acc_norm_stderr": 0.033948539651564025
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7461139896373057,
"acc_stderr": 0.0314102478056532,
"acc_norm": 0.7461139896373057,
"acc_norm_stderr": 0.0314102478056532
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5153846153846153,
"acc_stderr": 0.025339003010106515,
"acc_norm": 0.5153846153846153,
"acc_norm_stderr": 0.025339003010106515
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.02752859921034049,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.02752859921034049
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03242225027115006,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03242225027115006
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6495412844036698,
"acc_stderr": 0.020456077599824467,
"acc_norm": 0.6495412844036698,
"acc_norm_stderr": 0.020456077599824467
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.030851992993257013,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.030851992993257013
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.03465868196380762,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.03465868196380762
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6582278481012658,
"acc_stderr": 0.030874537537553617,
"acc_norm": 0.6582278481012658,
"acc_norm_stderr": 0.030874537537553617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5739910313901345,
"acc_stderr": 0.033188332862172806,
"acc_norm": 0.5739910313901345,
"acc_norm_stderr": 0.033188332862172806
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.0435644720266507,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.0435644720266507
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.04465869780531009,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.04465869780531009
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907061,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907061
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5705521472392638,
"acc_stderr": 0.03889066619112723,
"acc_norm": 0.5705521472392638,
"acc_norm_stderr": 0.03889066619112723
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.0465614711001235,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.0465614711001235
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.026853450377009157,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.026853450377009157
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6730523627075351,
"acc_stderr": 0.016774908180131467,
"acc_norm": 0.6730523627075351,
"acc_norm_stderr": 0.016774908180131467
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.02690784985628254,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.02690784985628254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41564245810055866,
"acc_stderr": 0.016482782187500662,
"acc_norm": 0.41564245810055866,
"acc_norm_stderr": 0.016482782187500662
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.028614624752805427,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.028614624752805427
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5209003215434084,
"acc_stderr": 0.02837327096106942,
"acc_norm": 0.5209003215434084,
"acc_norm_stderr": 0.02837327096106942
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.027801656212323674,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.027801656212323674
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.38652482269503546,
"acc_stderr": 0.029049190342543465,
"acc_norm": 0.38652482269503546,
"acc_norm_stderr": 0.029049190342543465
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3728813559322034,
"acc_stderr": 0.012350630058333353,
"acc_norm": 0.3728813559322034,
"acc_norm_stderr": 0.012350630058333353
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.40441176470588236,
"acc_stderr": 0.029812630701569743,
"acc_norm": 0.40441176470588236,
"acc_norm_stderr": 0.029812630701569743
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.020148939420415738,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.020148939420415738
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5306122448979592,
"acc_stderr": 0.031949171367580624,
"acc_norm": 0.5306122448979592,
"acc_norm_stderr": 0.031949171367580624
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.572139303482587,
"acc_stderr": 0.03498541988407795,
"acc_norm": 0.572139303482587,
"acc_norm_stderr": 0.03498541988407795
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32558139534883723,
"mc1_stderr": 0.016403989469907825,
"mc2": 0.5188093512935639,
"mc2_stderr": 0.016351300657386426
}
}
```
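To work with these aggregated numbers programmatically instead of copying them from this card, you can load the "results" config defined above. A minimal sketch, assuming the parquet mirrors the JSON structure shown here:
```python
from datasets import load_dataset

# Load the aggregated metrics for the latest run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_aqweteddy__llama_chat-tv_en_luban-tv_stable_platypus2",
    "results",
    split="latest",
)
# Assumption: one row per run, carrying the per-task metrics listed above.
print(results[0])
```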
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/nakano_yuka_idolmastercinderellagirls | 2023-09-17T17:37:48.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nakano_yuka (THE iDOLM@STER: Cinderella Girls)
This is the dataset of nakano_yuka (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 551 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 551 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 551 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 551 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
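If you only need one of the archives above, you can fetch it without cloning the whole repository. A hedged sketch using `huggingface_hub` (the filename must match one of the zip files listed in the table):
```python
from huggingface_hub import hf_hub_download

# Download the 512x512 aligned subset from this dataset repository.
path = hf_hub_download(
    repo_id="CyberHarem/nakano_yuka_idolmastercinderellagirls",
    filename="dataset-512x512.zip",
    repo_type="dataset",
)
print(path)  # local cache path of the downloaded archive
```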
|