id stringlengths 2 115 | author stringlengths 2 42 ⌀ | last_modified timestamp[us, tz=UTC] | downloads int64 0 8.87M | likes int64 0 3.84k | paperswithcode_id stringlengths 2 45 ⌀ | tags list | lastModified timestamp[us, tz=UTC] | createdAt stringlengths 24 24 | key stringclasses 1 value | created timestamp[us] | card stringlengths 1 1.01M | embedding list | library_name stringclasses 21 values | pipeline_tag stringclasses 27 values | mask_token null | card_data null | widget_data null | model_index null | config null | transformers_info null | spaces null | safetensors null | transformersInfo null | modelId stringlengths 5 111 ⌀ | embeddings list |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
atom-in-the-universe/bild-deduped-72 | atom-in-the-universe | 2023-11-28T07:56:56Z | 0 | 0 | null | [
"region:us"
] | 2023-11-28T07:56:56Z | 2023-11-14T17:11:39.000Z | 2023-11-14T17:11:39 | Entry not found | [
-0.32276472449302673,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965679168701,
0.7915717363357544,
0.07618629932403564,
0.7746022939682007,
0.2563222646713257,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
xieyizheng/normal_smal | xieyizheng | 2023-11-14T17:40:37Z | 0 | 0 | null | [
"region:us"
] | 2023-11-14T17:40:37Z | 2023-11-14T17:38:05.000Z | 2023-11-14T17:38:05 | Entry not found | [
-0.32276472449302673,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965679168701,
0.7915717363357544,
0.07618629932403564,
0.7746022939682007,
0.2563222646713257,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
ajibawa-2023/SlimOrca-ShareGPT | ajibawa-2023 | 2023-11-14T20:06:58Z | 0 | 3 | null | [
"task_categories:token-classification",
"task_categories:text-classification",
"size_categories:100K<n<1M",
"language:en",
"license:mit",
"region:us"
] | 2023-11-14T20:06:58Z | 2023-11-14T17:42:27.000Z | 2023-11-14T17:42:27 | ---
license: mit
language:
- en
size_categories:
- 100K<n<1M
task_categories:
- token-classification
- text-classification
pretty_name: SoS
---
**SlimOrca-ShareGPT**
This dataset is in Vicuna/ShareGPT format. There are 517,981 sets of conversations, each set containing 2 conversations.
The original dataset was released by [Open-Orca](https://huggingface.co/datasets/Open-Orca/SlimOrca). I have refined it so that "system" is not present.
The idea is to check how this dataset performs on Llama-2 & Mistral models. I will release both models very soon.
Will this dataset help improve the performance of fine-tuned models?
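To illustrate the refinement described above, here is a minimal sketch of dropping "system" turns from a ShareGPT-style record. This is a hypothetical helper, not code from the dataset author; the field names (`conversations`, `from`, `value`) follow the common ShareGPT convention and are an assumption, not confirmed by this card.

```python
# Hypothetical helper: remove "system" turns from a ShareGPT-style record.
# Field names ("conversations", "from", "value") follow the usual ShareGPT
# convention and are assumed, not taken from this dataset's actual schema.

def drop_system_turns(record):
    """Return a copy of the record with every 'system' turn removed."""
    return {
        "conversations": [
            turn for turn in record["conversations"]
            if turn["from"] != "system"
        ]
    }

record = {
    "conversations": [
        {"from": "system", "value": "You are a helpful assistant."},
        {"from": "human", "value": "What is 2+2?"},
        {"from": "gpt", "value": "4"},
    ]
}

cleaned = drop_system_turns(record)
print(len(cleaned["conversations"]))  # the two non-system turns remain
```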
All the credit goes to the Open-Orca team for releasing Orca & SlimOrca datasets. | [
-0.26549845933914185,
-0.7281295657157898,
0.10813998430967331,
0.26063984632492065,
-0.4441489577293396,
-0.4189271330833435,
-0.09675522148609161,
-0.6334487199783325,
0.6646102666854858,
0.7652232050895691,
-0.8344933390617371,
-0.1935104876756668,
-0.3767731189727783,
-0.14793461561203... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Glazastik/rutextdataset | Glazastik | 2023-11-14T17:46:48Z | 0 | 0 | null | [
"language:ru",
"language:fr",
"region:us"
] | 2023-11-14T17:46:48Z | 2023-11-14T17:45:45.000Z | 2023-11-14T17:45:45 | ---
language:
- ru
- fr
---
This dataset contains texts in Russian and French. | [
0.31953486800193787,
-0.5479445457458496,
0.16613277792930603,
0.3617889881134033,
-0.28676486015319824,
0.18226413428783417,
-0.018885040655732155,
-0.0716620609164238,
0.348402738571167,
0.8431402444839478,
-0.7698493003845215,
-0.5739554166793823,
-0.7696347236633301,
0.212668314576149,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Sinall/oshimsgs | Sinall | 2023-11-14T17:52:04Z | 0 | 0 | null | [
"license:other",
"region:us"
] | 2023-11-14T17:52:04Z | 2023-11-14T17:51:21.000Z | 2023-11-14T17:51:21 | ---
license: other
license_name: by-nc-sa
license_link: https://creativecommons.org/licenses/by-nc-sa/4.0/
dataset_info:
features:
- name: message
dtype: string
splits:
- name: train
num_bytes: 322094890
num_examples: 8060059
download_size: 220255204
dataset_size: 322094890
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
xwar/ninox_training | xwar | 2023-11-14T18:34:54Z | 0 | 0 | null | [
"region:us"
] | 2023-11-14T18:34:54Z | 2023-11-14T18:34:27.000Z | 2023-11-14T18:34:27 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Parmazanek/patrick | Parmazanek | 2023-11-14T18:51:53Z | 0 | 0 | null | [
"region:us"
] | 2023-11-14T18:51:53Z | 2023-11-14T18:51:19.000Z | 2023-11-14T18:51:19 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
shyamsubbu/java_ds | shyamsubbu | 2023-11-15T20:10:16Z | 0 | 1 | null | [
"region:us"
] | 2023-11-15T20:10:16Z | 2023-11-14T19:06:03.000Z | 2023-11-14T19:06:03 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_openaccess-ai-collective__mistral-7b-slimorcaboros_public | open-llm-leaderboard | 2023-11-14T19:10:08Z | 0 | 0 | null | [
"region:us"
] | 2023-11-14T19:10:08Z | 2023-11-14T19:09:15.000Z | 2023-11-14T19:09:15 | ---
pretty_name: Evaluation run of openaccess-ai-collective/mistral-7b-slimorcaboros
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openaccess-ai-collective/mistral-7b-slimorcaboros](https://huggingface.co/openaccess-ai-collective/mistral-7b-slimorcaboros)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openaccess-ai-collective__mistral-7b-slimorcaboros_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-14T19:06:13.668768](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__mistral-7b-slimorcaboros_public/blob/main/results_2023-11-14T19-06-13.668768.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6301042082006363,\n\
\ \"acc_stderr\": 0.032164201740811346,\n \"acc_norm\": 0.6380190670382948,\n\
\ \"acc_norm_stderr\": 0.03283508976201021,\n \"mc1\": 0.390452876376989,\n\
\ \"mc1_stderr\": 0.01707823074343145,\n \"mc2\": 0.5581158489169444,\n\
\ \"mc2_stderr\": 0.01565820515437776,\n \"em\": 0.03859060402684564,\n\
\ \"em_stderr\": 0.001972579977587539,\n \"f1\": 0.11617135067114018,\n\
\ \"f1_stderr\": 0.0024204909854951134\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6117747440273038,\n \"acc_stderr\": 0.014241614207414054,\n\
\ \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068285\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.650368452499502,\n\
\ \"acc_stderr\": 0.004758790172436686,\n \"acc_norm\": 0.8369846644094802,\n\
\ \"acc_norm_stderr\": 0.0036862475593618512\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013316,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013316\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.029146904747798328,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.029146904747798328\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\"\
: 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723292,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723292\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.0315841532404771,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.0315841532404771\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6435897435897436,\n \"acc_stderr\": 0.02428314052946731,\n \
\ \"acc_norm\": 0.6435897435897436,\n \"acc_norm_stderr\": 0.02428314052946731\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230186,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230186\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.01570349834846177,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.01570349834846177\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.02646056956124063,\n\
\ \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.02646056956124063\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7309417040358744,\n\
\ \"acc_stderr\": 0.02976377940687497,\n \"acc_norm\": 0.7309417040358744,\n\
\ \"acc_norm_stderr\": 0.02976377940687497\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834829,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834829\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3787709497206704,\n\
\ \"acc_stderr\": 0.01622353351036512,\n \"acc_norm\": 0.3787709497206704,\n\
\ \"acc_norm_stderr\": 0.01622353351036512\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537368,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537368\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.48435462842242505,\n \"acc_stderr\": 0.01276398283812096,\n\
\ \"acc_norm\": 0.48435462842242505,\n \"acc_norm_stderr\": 0.01276398283812096\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n \"\
acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6388888888888888,\n \"acc_stderr\": 0.01943177567703731,\n \
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.01943177567703731\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274645,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274645\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.390452876376989,\n\
\ \"mc1_stderr\": 0.01707823074343145,\n \"mc2\": 0.5581158489169444,\n\
\ \"mc2_stderr\": 0.01565820515437776\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838238\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.03859060402684564,\n \
\ \"em_stderr\": 0.001972579977587539,\n \"f1\": 0.11617135067114018,\n\
\ \"f1_stderr\": 0.0024204909854951134\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.23426838514025777,\n \"acc_stderr\": 0.011666415127631034\n\
\ }\n}\n```"
repo_url: https://huggingface.co/openaccess-ai-collective/mistral-7b-slimorcaboros
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|arc:challenge|25_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|drop|3_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|gsm8k|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hellaswag|10_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-14T19-06-13.668768.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-14T19-06-13.668768.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- '**/details_harness|winogrande|5_2023-11-14T19-06-13.668768.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-14T19-06-13.668768.parquet'
- config_name: results
data_files:
- split: 2023_11_14T19_06_13.668768
path:
- results_2023-11-14T19-06-13.668768.parquet
- split: latest
path:
- results_2023-11-14T19-06-13.668768.parquet
---
# Dataset Card for Evaluation run of openaccess-ai-collective/mistral-7b-slimorcaboros
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openaccess-ai-collective/mistral-7b-slimorcaboros
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openaccess-ai-collective/mistral-7b-slimorcaboros](https://huggingface.co/openaccess-ai-collective/mistral-7b-slimorcaboros) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
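As a minimal sketch of how the configuration names above relate to the task identifiers used in the parquet file names: a harness task id such as `harness|hendrycksTest-abstract_algebra|5` maps to the config name `harness_hendrycksTest_abstract_algebra_5` by replacing separators with underscores. The helper below is a hypothetical illustration, not part of the evaluation tooling:

```python
def config_name(task_id: str) -> str:
    """Derive the dataset config name from a harness task identifier.

    Pipes separate the harness name, task name, and few-shot count;
    the config name joins them with underscores and also replaces
    "-" and ":" with "_".
    """
    return task_id.replace("|", "_").replace("-", "_").replace(":", "_")

print(config_name("harness|hendrycksTest-abstract_algebra|5"))
# harness_hendrycksTest_abstract_algebra_5
print(config_name("harness|truthfulqa:mc|0"))
# harness_truthfulqa_mc_0
```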
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openaccess-ai-collective__mistral-7b-slimorcaboros_public",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-11-14T19:06:13.668768](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__mistral-7b-slimorcaboros_public/blob/main/results_2023-11-14T19-06-13.668768.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "latest" split of the corresponding configuration):
```python
{
"all": {
"acc": 0.6301042082006363,
"acc_stderr": 0.032164201740811346,
"acc_norm": 0.6380190670382948,
"acc_norm_stderr": 0.03283508976201021,
"mc1": 0.390452876376989,
"mc1_stderr": 0.01707823074343145,
"mc2": 0.5581158489169444,
"mc2_stderr": 0.01565820515437776,
"em": 0.03859060402684564,
"em_stderr": 0.001972579977587539,
"f1": 0.11617135067114018,
"f1_stderr": 0.0024204909854951134
},
"harness|arc:challenge|25": {
"acc": 0.6117747440273038,
"acc_stderr": 0.014241614207414054,
"acc_norm": 0.636518771331058,
"acc_norm_stderr": 0.014056207319068285
},
"harness|hellaswag|10": {
"acc": 0.650368452499502,
"acc_stderr": 0.004758790172436686,
"acc_norm": 0.8369846644094802,
"acc_norm_stderr": 0.0036862475593618512
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013316,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013316
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.029146904747798328,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.029146904747798328
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.0315841532404771,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.0315841532404771
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6435897435897436,
"acc_stderr": 0.02428314052946731,
"acc_norm": 0.6435897435897436,
"acc_norm_stderr": 0.02428314052946731
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230186,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230186
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.01570349834846177,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.01570349834846177
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.02646056956124063,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.02646056956124063
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579654,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579654
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7309417040358744,
"acc_stderr": 0.02976377940687497,
"acc_norm": 0.7309417040358744,
"acc_norm_stderr": 0.02976377940687497
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834829,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834829
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3787709497206704,
"acc_stderr": 0.01622353351036512,
"acc_norm": 0.3787709497206704,
"acc_norm_stderr": 0.01622353351036512
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.024748624490537368,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.024748624490537368
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48435462842242505,
"acc_stderr": 0.01276398283812096,
"acc_norm": 0.48435462842242505,
"acc_norm_stderr": 0.01276398283812096
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.01943177567703731,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.01943177567703731
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274645,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727668,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727668
},
"harness|truthfulqa:mc|0": {
"mc1": 0.390452876376989,
"mc1_stderr": 0.01707823074343145,
"mc2": 0.5581158489169444,
"mc2_stderr": 0.01565820515437776
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838238
},
"harness|drop|3": {
"em": 0.03859060402684564,
"em_stderr": 0.001972579977587539,
"f1": 0.11617135067114018,
"f1_stderr": 0.0024204909854951134
},
"harness|gsm8k|5": {
"acc": 0.23426838514025777,
"acc_stderr": 0.011666415127631034
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.737860381603241,
-0.8397073149681091,
0.25926655530929565,
0.18014736473560333,
-0.1585623025894165,
-0.0782749205827713,
-0.0033035501837730408,
-0.2210664004087448,
0.5848891139030457,
-0.0189806018024683,
-0.4833366572856903,
-0.7096142172813416,
-0.4251250922679901,
0.24495950341224... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
cosminn82/enron_dev_tst | cosminn82 | 2023-11-14T19:22:14Z | 0 | 0 | null | [
"region:us"
] | 2023-11-14T19:22:14Z | 2023-11-14T19:21:27.000Z | 2023-11-14T19:21:27 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
incivility-UOH/TwitCivility | incivility-UOH | 2023-11-16T16:12:02Z | 0 | 0 | null | [
"task_categories:text-classification",
"language:en",
"license:mit",
"arxiv:2305.14964",
"region:us"
] | 2023-11-16T16:12:02Z | 2023-11-14T19:25:02.000Z | 2023-11-14T19:25:02 | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
- name: impoliteness
dtype: int64
- name: intolerance
dtype: int64
splits:
- name: train
num_bytes: 2169574.4014020115
num_examples: 10498
- name: test
num_bytes: 542703.5985979884
num_examples: 2626
download_size: 1726706
dataset_size: 2712278
task_categories:
- text-classification
language:
- en
---
## Overview
The TwitCivility dataset is specifically developed to classify political incivility, focusing on multidimensional aspects of impoliteness and intolerance.
Detailed methodologies are outlined in our [paper](https://arxiv.org/abs/2305.14964).
## Languages
All text is written in English.
## Dataset Structure
### Data Fields
We release TwitCivility as a data frame with the following fields: <br />
**text**: This field contains the text (after preprocessing and anonymization) of the tweet. <br />
**impoliteness**: A binary indicator (1 or 0) representing the presence of impoliteness in the text. A value of 1 signifies impoliteness, while 0 indicates non-impoliteness. <br />
**intolerance**: Similarly, this binary value denotes the presence of intolerance in the text, with 1 indicating intolerance and 0 signifying non-intolerance. <br />
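As an illustration of how the two binary fields can be combined into a single incivility flag, here is a minimal stdlib sketch; the rows below are hypothetical examples mirroring the schema, not actual dataset entries:

```python
# Hypothetical rows following the TwitCivility schema
# (text, impoliteness, intolerance).
rows = [
    {"text": "example tweet a", "impoliteness": 1, "intolerance": 0},
    {"text": "example tweet b", "impoliteness": 0, "intolerance": 0},
    {"text": "example tweet c", "impoliteness": 1, "intolerance": 1},
]

def is_uncivil(row):
    # A tweet is uncivil if it is flagged on either dimension.
    return bool(row["impoliteness"] or row["intolerance"])

uncivil_count = sum(is_uncivil(r) for r in rows)
print(uncivil_count)  # 2
```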
## Citation Information
```
@misc{incivility2023,
title={Detecting Multidimensional Political Incivility on Social Media},
author={Sagi Pendzel and Nir Lotan and Alon Zoizner and Einat Minkov},
year={2023},
eprint={2305.14964},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | [
-0.10652565211057663,
-0.6938428282737732,
0.26488885283470154,
0.4262814223766327,
-0.07171973586082458,
0.31324678659439087,
-0.36753472685813904,
-0.07223858684301376,
0.5235442519187927,
0.05352408438920975,
-0.595931887626648,
-0.8656339645385742,
-0.8569871783256531,
-0.0248268395662... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
jlbaker361/multi_whole | jlbaker361 | 2023-11-14T19:29:11Z | 0 | 0 | null | [
"region:us"
] | 2023-11-14T19:29:11Z | 2023-11-14T19:29:10.000Z | 2023-11-14T19:29:10 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 203716
num_examples: 10000
download_size: 107015
dataset_size: 203716
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "multi_whole"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.6259769201278687,
-0.42154037952423096,
0.23766012489795685,
0.4328545331954956,
-0.4456954896450043,
-0.09768880158662796,
0.16296058893203735,
-0.18524526059627533,
1.088565707206726,
0.5490136742591858,
-0.850525438785553,
-0.6744116544723511,
-0.6534767150878906,
-0.1987884491682052... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Squexz/Boywithuke | Squexz | 2023-11-14T19:43:06Z | 0 | 0 | null | [
"region:us"
] | 2023-11-14T19:43:06Z | 2023-11-14T19:43:06.000Z | 2023-11-14T19:43:06 | Entry not found | [
-0.32276487350463867,
-0.22568444907665253,
0.8622263073921204,
0.43461570143699646,
-0.5282988548278809,
0.7012969255447388,
0.7915717363357544,
0.07618642598390579,
0.7746027112007141,
0.25632190704345703,
-0.7852815389633179,
-0.22573848068714142,
-0.910447895526886,
0.5715675354003906,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Taylor658/med_train_nov23 | Taylor658 | 2023-11-18T09:38:38Z | 0 | 0 | null | [
"region:us"
] | 2023-11-18T09:38:38Z | 2023-11-14T19:45:19.000Z | 2023-11-14T19:45:19 | ---
license: apache-2.0
---
# Dataset Card for Medical Question Answering Dataset
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
## Dataset Description
### Dataset Summary
This dataset contains a collection of question-answer pairs related to various medical topics. The data is structured to provide comprehensive answers to specific medical questions, covering information, diagnosis, treatment, prevention, and susceptibility related to different health conditions.
### Supported Tasks and Leaderboards
The dataset is suitable for tasks like medical question answering, natural language understanding, and information retrieval in the healthcare domain.
### Languages
The dataset is primarily in English.
## Dataset Structure
### Data Instances
An example from the dataset:
- Question: "What are the treatments for acanthamoeba?"
- Answer: "Early diagnosis is essential for effective treatment of acanthamoeba..."
### Data Fields
- `question`: The medical question.
- `answer`: The answer to the medical question.
### Data Splits
The dataset is not split into training, validation, or test sets. Users may need to create these splits if required.
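Since no predefined splits ship with the dataset, users can create a reproducible split themselves; a minimal stdlib sketch (the question-answer pairs here are hypothetical placeholders, not dataset content):

```python
import random

# Hypothetical question-answer pairs following the dataset's schema.
pairs = [
    {"question": f"Question {i}?", "answer": f"Answer {i}."}
    for i in range(10)
]

# Shuffle with a fixed seed for reproducibility, then hold out 20% for testing.
rng = random.Random(42)
shuffled = pairs[:]
rng.shuffle(shuffled)

cut = int(len(shuffled) * 0.8)
train, test = shuffled[:cut], shuffled[cut:]
print(len(train), len(test))  # 8 2
```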
## Dataset Creation
### Curation Rationale
This dataset was created to facilitate research and development in medical question answering systems, aiming to improve access to medical information.
### Source Data
The data was compiled from various medical resources and designed to be comprehensive and informative.
### Annotations
Not applicable as the dataset consists of pre-existing question-answer pairs.
### Personal and Sensitive Information
Questions and answers do not contain personal information. However, users should be cautious when integrating this data into applications, considering privacy and ethical implications.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset can aid in developing systems that provide quick and accurate medical information, potentially improving healthcare outcomes.
### Discussion of Biases
No specific biases have been identified in the dataset. However, users should be aware of the potential for bias in language models trained on this data.
### Other Known Limitations
The dataset might not cover all possible medical questions and might be limited in scope regarding certain medical conditions.
## Additional Information
| [
-0.2894933223724365,
-1.1412149667739868,
0.4556209444999695,
0.13803927600383759,
-0.28384605050086975,
-0.15915453433990479,
0.020864170044660568,
-0.4282567799091339,
0.484127938747406,
0.6386244893074036,
-0.8542453050613403,
-0.8395451307296753,
-0.6065207719802856,
0.2437990754842758... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
KnutJaegersberg/longinstruct | KnutJaegersberg | 2023-11-14T19:59:26Z | 0 | 2 | null | [
"license:mit",
"region:us"
] | 2023-11-14T19:59:26Z | 2023-11-14T19:58:15.000Z | 2023-11-14T19:58:15 | ---
license: mit
---
| [
-0.1285339742898941,
-0.18616800010204315,
0.6529127359390259,
0.4943626821041107,
-0.1931934952735901,
0.2360742688179016,
0.360720157623291,
0.05056300014257431,
0.5793654322624207,
0.7400140166282654,
-0.6508105993270874,
-0.23783984780311584,
-0.7102248668670654,
-0.047826044261455536,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Lukinha012/00poze | Lukinha012 | 2023-11-14T20:08:25Z | 0 | 0 | null | [
"region:us"
] | 2023-11-14T20:08:25Z | 2023-11-14T20:06:59.000Z | 2023-11-14T20:06:59 | Entry not found | [
-0.32276487350463867,
-0.22568444907665253,
0.8622263073921204,
0.43461570143699646,
-0.5282988548278809,
0.7012969255447388,
0.7915717363357544,
0.07618642598390579,
0.7746027112007141,
0.25632190704345703,
-0.7852815389633179,
-0.22573848068714142,
-0.910447895526886,
0.5715675354003906,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Tiefighter_public | open-llm-leaderboard | 2023-11-14T20:29:00Z | 0 | 0 | null | [
"region:us"
] | 2023-11-14T20:29:00Z | 2023-11-14T20:28:13.000Z | 2023-11-14T20:28:13 | ---
pretty_name: Evaluation run of KoboldAI/LLaMA2-13B-Tiefighter
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KoboldAI/LLaMA2-13B-Tiefighter](https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Tiefighter_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-14T20:25:09.144693](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Tiefighter_public/blob/main/results_2023-11-14T20-25-09.144693.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5460312696004332,\n\
\ \"acc_stderr\": 0.03357446611113244,\n \"acc_norm\": 0.5555362057698711,\n\
\ \"acc_norm_stderr\": 0.03444530254256153,\n \"mc1\": 0.379436964504284,\n\
\ \"mc1_stderr\": 0.016987039266142985,\n \"mc2\": 0.5301656358073983,\n\
\ \"mc2_stderr\": 0.01568757011022921,\n \"em\": 0.11115771812080537,\n\
\ \"em_stderr\": 0.00321900621779521,\n \"f1\": 0.1838915687919454,\n\
\ \"f1_stderr\": 0.0033646558993111948\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.568259385665529,\n \"acc_stderr\": 0.014474591427196202,\n\
\ \"acc_norm\": 0.5989761092150171,\n \"acc_norm_stderr\": 0.014322255790719867\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6500697072296355,\n\
\ \"acc_stderr\": 0.004759729267943188,\n \"acc_norm\": 0.8399721171081458,\n\
\ \"acc_norm_stderr\": 0.003658826208101615\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981749,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981749\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.030151134457776285,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.030151134457776285\n \
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\
\ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n\
\ \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006716,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006716\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.04793724854411021,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.04793724854411021\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523857,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523857\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147125,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147125\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n\
\ \"acc_stderr\": 0.027327548447957536,\n \"acc_norm\": 0.6387096774193548,\n\
\ \"acc_norm_stderr\": 0.027327548447957536\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406795,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406795\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.03003114797764154,\n\
\ \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.03003114797764154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5051282051282051,\n \"acc_stderr\": 0.025349672906838653,\n\
\ \"acc_norm\": 0.5051282051282051,\n \"acc_norm_stderr\": 0.025349672906838653\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871937,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871937\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182087,\n \
\ \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182087\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7211009174311926,\n \"acc_stderr\": 0.019227468876463507,\n \"\
acc_norm\": 0.7211009174311926,\n \"acc_norm_stderr\": 0.019227468876463507\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37962962962962965,\n \"acc_stderr\": 0.03309682581119035,\n \"\
acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.03309682581119035\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.0283046579430353,\n\
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.0283046579430353\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n\
\ \"acc_stderr\": 0.025372139671722933,\n \"acc_norm\": 0.8162393162393162,\n\
\ \"acc_norm_stderr\": 0.025372139671722933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7458492975734355,\n\
\ \"acc_stderr\": 0.015569254692045752,\n \"acc_norm\": 0.7458492975734355,\n\
\ \"acc_norm_stderr\": 0.015569254692045752\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6213872832369942,\n \"acc_stderr\": 0.026113749361310345,\n\
\ \"acc_norm\": 0.6213872832369942,\n \"acc_norm_stderr\": 0.026113749361310345\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33631284916201115,\n\
\ \"acc_stderr\": 0.0158010037291459,\n \"acc_norm\": 0.33631284916201115,\n\
\ \"acc_norm_stderr\": 0.0158010037291459\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.027826109307283697,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.027826109307283697\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\
\ \"acc_stderr\": 0.027648149599751464,\n \"acc_norm\": 0.6141479099678456,\n\
\ \"acc_norm_stderr\": 0.027648149599751464\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573083,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573083\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4335071707953064,\n\
\ \"acc_stderr\": 0.012656810383983965,\n \"acc_norm\": 0.4335071707953064,\n\
\ \"acc_norm_stderr\": 0.012656810383983965\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.030332578094555033,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.030332578094555033\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5604575163398693,\n \"acc_stderr\": 0.020079420408087918,\n \
\ \"acc_norm\": 0.5604575163398693,\n \"acc_norm_stderr\": 0.020079420408087918\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030802,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030802\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.03152439186555401,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.03152439186555401\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.379436964504284,\n\
\ \"mc1_stderr\": 0.016987039266142985,\n \"mc2\": 0.5301656358073983,\n\
\ \"mc2_stderr\": 0.01568757011022921\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.745067087608524,\n \"acc_stderr\": 0.012248806969376422\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.11115771812080537,\n \
\ \"em_stderr\": 0.00321900621779521,\n \"f1\": 0.1838915687919454,\n \
\ \"f1_stderr\": 0.0033646558993111948\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.006823351023502654,\n \"acc_stderr\": 0.002267537102254515\n\
\ }\n}\n```"
repo_url: https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|arc:challenge|25_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|drop|3_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|gsm8k|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hellaswag|10_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|winogrande|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-14T20-25-09.144693.parquet'
- config_name: results
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- results_2023-11-14T20-25-09.144693.parquet
- split: latest
path:
- results_2023-11-14T20-25-09.144693.parquet
---
# Dataset Card for Evaluation run of KoboldAI/LLaMA2-13B-Tiefighter
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KoboldAI/LLaMA2-13B-Tiefighter](https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
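Since each run is stored under a split named after its timestamp, the "latest" alias can be thought of as selecting the most recent timestamped split. A minimal sketch of that convention (the second split name below is a hypothetical earlier run, added for illustration):

```python
# Timestamped split names like "2023_11_14T20_25_09.144693" are zero-padded,
# so lexicographic order matches chronological order and max() picks the newest run.
splits = [
    "2023_11_14T20_25_09.144693",
    "2023_10_01T09_00_00.000000",  # hypothetical earlier run
]
latest = max(splits)
print(latest)  # 2023_11_14T20_25_09.144693
```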
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Tiefighter_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-14T20:25:09.144693](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Tiefighter_public/blob/main/results_2023-11-14T20-25-09.144693.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the "results" configuration and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5460312696004332,
"acc_stderr": 0.03357446611113244,
"acc_norm": 0.5555362057698711,
"acc_norm_stderr": 0.03444530254256153,
"mc1": 0.379436964504284,
"mc1_stderr": 0.016987039266142985,
"mc2": 0.5301656358073983,
"mc2_stderr": 0.01568757011022921,
"em": 0.11115771812080537,
"em_stderr": 0.00321900621779521,
"f1": 0.1838915687919454,
"f1_stderr": 0.0033646558993111948
},
"harness|arc:challenge|25": {
"acc": 0.568259385665529,
"acc_stderr": 0.014474591427196202,
"acc_norm": 0.5989761092150171,
"acc_norm_stderr": 0.014322255790719867
},
"harness|hellaswag|10": {
"acc": 0.6500697072296355,
"acc_stderr": 0.004759729267943188,
"acc_norm": 0.8399721171081458,
"acc_norm_stderr": 0.003658826208101615
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.030151134457776285,
"acc_norm": 0.6,
"acc_norm_stderr": 0.030151134457776285
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006716,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006716
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523857,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523857
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147125,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147125
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.027327548447957536,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.027327548447957536
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406795,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406795
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7772020725388601,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.7772020725388601,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5051282051282051,
"acc_stderr": 0.025349672906838653,
"acc_norm": 0.5051282051282051,
"acc_norm_stderr": 0.025349672906838653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871937,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871937
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.03233943468182087,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.03233943468182087
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7211009174311926,
"acc_stderr": 0.019227468876463507,
"acc_norm": 0.7211009174311926,
"acc_norm_stderr": 0.019227468876463507
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.0283046579430353,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.0283046579430353
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.025372139671722933,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.025372139671722933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7458492975734355,
"acc_stderr": 0.015569254692045752,
"acc_norm": 0.7458492975734355,
"acc_norm_stderr": 0.015569254692045752
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6213872832369942,
"acc_stderr": 0.026113749361310345,
"acc_norm": 0.6213872832369942,
"acc_norm_stderr": 0.026113749361310345
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33631284916201115,
"acc_stderr": 0.0158010037291459,
"acc_norm": 0.33631284916201115,
"acc_norm_stderr": 0.0158010037291459
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.027826109307283697,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.027826109307283697
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751464,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751464
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271146,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271146
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573083,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573083
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4335071707953064,
"acc_stderr": 0.012656810383983965,
"acc_norm": 0.4335071707953064,
"acc_norm_stderr": 0.012656810383983965
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.030332578094555033,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.030332578094555033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5604575163398693,
"acc_stderr": 0.020079420408087918,
"acc_norm": 0.5604575163398693,
"acc_norm_stderr": 0.020079420408087918
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030802,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030802
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555401,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555401
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.379436964504284,
"mc1_stderr": 0.016987039266142985,
"mc2": 0.5301656358073983,
"mc2_stderr": 0.01568757011022921
},
"harness|winogrande|5": {
"acc": 0.745067087608524,
"acc_stderr": 0.012248806969376422
},
"harness|drop|3": {
"em": 0.11115771812080537,
"em_stderr": 0.00321900621779521,
"f1": 0.1838915687919454,
"f1_stderr": 0.0033646558993111948
},
"harness|gsm8k|5": {
"acc": 0.006823351023502654,
"acc_stderr": 0.002267537102254515
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7224230766296387,
-0.845271646976471,
0.27361971139907837,
0.23749738931655884,
-0.22335241734981537,
-0.02999655343592167,
0.008222628384828568,
-0.22516843676567078,
0.5850669145584106,
-0.04290781915187836,
-0.48696959018707275,
-0.6427080035209656,
-0.45241641998291016,
0.2291223555... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
abrauf/DKA_dataset | abrauf | 2023-11-14T20:36:47Z | 0 | 0 | null | [
"license:cc",
"region:us"
] | 2023-11-14T20:36:47Z | 2023-11-14T20:35:30.000Z | 2023-11-14T20:35:30 | ---
license: cc
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
BreadboardLabs/CurioTreeData | BreadboardLabs | 2023-11-28T20:46:29Z | 0 | 0 | null | [
"size_categories:1M<n<10M",
"license:cc-by-nc-4.0",
"climate",
"trees",
"images",
"region:us"
] | 2023-11-28T20:46:29Z | 2023-11-14T20:37:05.000Z | 2023-11-14T20:37:05 | ---
license: cc-by-nc-4.0
tags:
- climate
- trees
- images
size_categories:
- 1M<n<10M
---
# The Curio Tree Dataset
This dataset contains much of the tree inventory, images and stories data that was collected on the [Curio platform](https://www.youtube.com/@curio-xyz7991/videos) before it was sunset. The data was extracted from a number of database tables and includes:
- The inventory details of 2.5 million trees from locations across the globe (location, species, diameter at breast height (DBH), height, vitality, etc., where available)
- 27,288 images of trees that were uploaded onto the platform by our community and linked to individual trees and their species information etc.
- Notes (stories), tags and conversations linked to trees.
### Dataset Description
Curio was an environmental education and outreach platform that was predominantly focused on urban forestry. It connected the various stakeholders involved in the management of urban forestry with the public and importantly made all data uploaded via its web and mobile apps publicly available. The platform was live from March 2016 until August 2023, when the maintenance overheads made its ongoing availability infeasible. Curio was supported in its early stages by two European Space Agency projects, through the [New Commons](https://business.esa.int/projects/new-commons) and [Curio Canopy](https://business.esa.int/projects/curio-canopy). A sense of the platform and how it worked can be found via the videos on its supporting [youtube channel](https://www.youtube.com/@curio-xyz7991/videos)
This repository contains much of the tree inventory, images and stories data that was collected on the platform via our community, projects we helped support and open data tree inventories we uploaded onto the platform. We are keen to make this data available for research purposes in the hope it might be of benefit to others and to further the efforts of our community.
We have endeavoured to name as many as possible of the great projects and data sources that were hosted on the Curio platform in the attribution section below. If there are any omissions or errors, please contact us.
A related project involved generating a high resolution map of tree canopy cover for the Greater London Authority. Details of that project and dataset can be found on the [London Datastore Curio Canopy page](https://data.london.gov.uk/dataset/curio-canopy).
- **Curated by:** Breadboard Labs
- **License:** cc-by-nc-4.0
### Dataset Sources and Attribution
Many people picked up the app and contributed to the data that was collected. Curio was also used to support many great projects and initiatives. We have endeavoured to mention many of those projects below along with the open data tree inventories we uploaded onto the platform.
#### Collaborative projects supported by Curio
- [Morton Arboretum](https://mortonarb.org/) - [Chicago Regional Tree Initiative](https://chicagorti.org/programs/)
- [Dublin City Council’s Parks, Biodiversity and Landscape Services](https://www.dublincity.ie/residential/parks) & [School of Geography at University College Dublin](https://www.ucd.ie/geography) - [Tree Mapping Dublin](https://mappinggreendublin.com/)
- [Sacramento Tree Foundation](https://sactree.org/) - [Save the Elms Program](https://sactree.org/programs/monitoring-elms/)
- [Cambridge City Council](https://www.cambridge.gov.uk/) - [Cambridge City Canopy Programme](https://www.cambridge.gov.uk/cambridge-canopy-project)
- [Municipality of Oslo Agency for Urban Environment](https://www.visitoslo.com/en/product/?tlp=593685) - Inventory and ecosystem services report hosting
- [Friends of Brunswick Park](http://www.friendsofbrunswickpark.co.uk/)
- [Exeter Trees](www.exetertrees.uk)
- [Wembley Park Limited](https://wembleypark.com/)
- [Washington Square Park Eco Projects](https://www.wspecoprojects.org/)
- [Coláiste Bríde Enniscorthy](https://www.colaistebride.ie/)
- [Enniscorthy Vocational College](https://www.enniscorthycc.ie/)
- [Mountshannon Arboretum](https://www.mountshannonarboretum.com/) - Forester Bernard Carey initiated the Mountshannon i-Tree project, in conjunction with UCD and UK-based consultancy Treeconomics.
- [Sidmouth Arboretum](http://sidmoutharboretum.org.uk/)
- [East Devon District Council](https://eastdevon.gov.uk/)
- [SLU](https://www.slu.se/en/) - Alnarp - Skåne Tree Inventory and support for and involvement in the New Commons and Curio Canopy projects
- [Malmö Stad](https://malmo.se/) - Malmö Tree Inventory and support for and involvement in the New Commons and Curio Canopy projects
- [Göteborgs Stad](https://goteborg.se/)
- [Halmstad](https://www.halmstad.se/)
- [Hvilan](https://www.hvilanutbildning.se/)
- [Familjebostader](https://familjebostader.com/om-oss/)
#### Open Data Sources Attribution
- The Greater London Authority Datastore - [Local Authority Maintained Trees](https://data.london.gov.uk/dataset/local-authority-maintained-trees)
- NYC OpenData - [2015 Street Tree Census - Tree Data](https://data.cityofnewyork.us/Environment/2015-Street-Tree-Census-Tree-Data/uvpi-gqnh)
- Open Data BDN - [Street trees of the city of Barcelona](https://opendata-ajuntament.barcelona.cat/data/dataset/arbrat-viari)
- Open Data Bristol - [Trees](https://opendata.bristol.gov.uk/datasets/7a99218a4bf347ff948f0e5882406a8c)
- Open Data NI - [Belfast City Trees](https://admin.opendatani.gov.uk/dataset/belfast-trees)
- Denver Open data - [Tree Inventory](https://denvergov.org/opendata/dataset/city-and-county-of-denver-tree-inventory)
- Open Data DK - [City of Copenhagen Trees](https://www.opendata.dk/city-of-copenhagen/trae-basis-kommunale-traeer)
- Palo Alto Open Data - [Palo Alto Trees](https://data.cityofpaloalto.org/dataviews/73226/palo-alto-trees/)
- Fingal County Council Open Data - [Fingal County Council Trees](https://data.fingal.ie/maps/1e5f9db62e53443d946c15a1a06fd98b_0/explore)
- Data SA - [City of Adelaide Street Trees](https://data.sa.gov.au/data/dataset/street-trees)
- Open Data Boulder Colorado - [Tree Inventory Open Data](https://open-data.bouldercolorado.gov/datasets/dbbae8bdb0a44d17934243b88e85ef2b)
- Biodiversity Ireland - [Heritage Trees Ireland](https://maps.biodiversityireland.ie/Dataset/27)
- Birmingham City Council Trees
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
The data is free to be used for research purposes subject to the cc-by-nc-4.0 licence and suitable attribution; please see the citation section below.
Some potential uses might include:
- Investigations into urban tree biodiversity.
- The development of algorithms for extracting tree attributes via photos or streetview imagery.
- A tree species detection app.
- The detection of trees via satellite imagery.
- Species identification via hyperspectral imagery.
It is worth noting that for most use cases, cleaning, analysis and processing of the data will be necessary. The completeness of the tree inventory data varies greatly, and users were not directed in any way in terms of how to frame the photos they took and uploaded via the Curio app.
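As a sketch of such a first cleaning pass, the snippet below filters out rows with missing species references or implausible measurements. This is illustrative only: the column names `tree_species_id` and `dbh` follow the fields described in this card, and the inline sample rows are invented, not actual dataset entries.

```python
import pandas as pd

# Invented sample rows; in practice you would load the released
# TaggedTrees file instead, e.g. pd.read_csv("TaggedTrees.csv").
trees = pd.DataFrame({
    "id": [1, 2, 3, 4],
    "tree_species_id": [10, None, 10, 12],
    "dbh": [35.0, -1.0, 80.0, None],  # diameter at breast height
    "vitality": ["good", None, "poor", "good"],
})

# Keep only rows with a species reference and a plausible positive DBH.
clean = trees[trees["tree_species_id"].notna() & (trees["dbh"] > 0)]
print(len(clean))  # 2
```

The same kind of sanity filtering applies to most of the inventory fields, since their completeness varies by source.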
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
### TaggedTrees
Number of data points: 2,593,139
The details of an individual tree, including its location, species, diameter at breast height (DBH), vitality, etc., when available.
### Images
Number of data points: 27,288
The details of the images that were uploaded to the platform, including the path to each uploaded image; the image files themselves can be found in the uploads directory. The details of what each image was attached to (usually a "Story" that was in turn attached to a tree) are also included.
### Uploads
The set of images referenced in the images data file. Even when zipped the set was quite large, so it was broken up into 10 GB chunks. Download each of the chunks and then run unzip on the uploads.zip file.
A folder containing downsized versions of the images based on a fixed width has also been included - resized-uploads-width1200.zip
### Stories
The details of a story that was attached to a tree.
### Notes
The text included in a story/note about a tree.
### Conversations & Comments
Comments grouped by conversations linked to a particular Story
### TreeSpecies
The tree species dictionary we built to support the platform. Each TaggedTree has a tree_species_id that references an entry in this dictionary when populated.
### TreeSpeciesAliases
The local names across multiple languages that can be used to describe a species of tree contained in the TreeSpecies dictionary.
### Tags and Taggings
Trees could be tagged with details such as diseased, monitored, newly_planted, apples, overhead cables, etc. Anything at all, really, that could later be used to filter, group or identify trees of interest, as well as describe their state.
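To illustrate how these pieces connect, here is a minimal pandas sketch that resolves species names for tagged trees via `tree_species_id`. The miniature tables and species names below are invented for illustration; the real TaggedTrees and TreeSpecies tables ship with the dataset.

```python
import pandas as pd

# Invented miniature versions of the TaggedTrees and TreeSpecies tables.
tagged_trees = pd.DataFrame({
    "id": [1, 2],
    "tree_species_id": [10, 12],
})
tree_species = pd.DataFrame({
    "id": [10, 12],
    "scientific_name": ["Quercus robur", "Tilia cordata"],
})

# Each TaggedTree row references an entry in the species dictionary,
# so a join recovers the species details for every tree.
resolved = tagged_trees.merge(
    tree_species, left_on="tree_species_id", right_on="id",
    suffixes=("_tree", "_species"),
)
print(resolved["scientific_name"].tolist())
```

The same join pattern extends to Stories, Notes and Taggings, which reference trees by id in the same way.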
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
The goal of the Curio platform was to educate, engage and democratise access to environmental information. Making the data collected on the platform available in this form is seen as an extension of that mission.
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
All data was collected via the Curio app by its community. Where inventory data was uploaded in bulk, we preprocessed the data to ensure that details such as species information were mapped to the species dictionary we defined, which has been included in this release.
Before making the data available on this platform we decided to run face detection and blur any obvious, detectable faces found in the images that have been included.
<!-- #### Who are the source data producers? -->
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
<!-- #### Personal and Sensitive Information -->
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
<!-- ## Bias, Risks, and Limitations -->
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
@misc{CurioTreeData,
title = {The Curio Tree Dataset},
author = {Conor Nugent and Paul Hickey},
year = {2023},
publisher = {HuggingFace},
journal = {HuggingFace repository},
howpublished = {\url{https://huggingface.co/datasets/BreadboardLabs/CurioTreeData}},
}
## Dataset Card Authors
Conor Nugent and Paul Hickey
## Dataset Card Contact
[Conor Nugent](https://www.linkedin.com/in/conor-nugent-5b02458/?originalSubdomain=ie) | [
-0.2893421947956085,
-0.21824632585048676,
0.43411898612976074,
0.07437966018915176,
-0.22269703447818756,
0.15063895285129547,
0.2521125078201294,
-0.9941790699958801,
0.4332308769226074,
0.25387272238731384,
-0.3263305723667145,
-0.729732871055603,
-0.1971668004989624,
-0.001684380229562... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
FTi130/gsplats-storage | FTi130 | 2023-11-15T12:38:30Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T12:38:30Z | 2023-11-14T21:21:25.000Z | 2023-11-14T21:21:25 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
kushinm/gradioTest | kushinm | 2023-11-15T20:32:36Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-15T20:32:36Z | 2023-11-14T21:26:57.000Z | 2023-11-14T21:26:57 | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
cosminn82/ppay_train_nov_14 | cosminn82 | 2023-11-14T21:34:39Z | 0 | 0 | null | [
"region:us"
] | 2023-11-14T21:34:39Z | 2023-11-14T21:33:59.000Z | 2023-11-14T21:33:59 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
nateraw/humarin-chatgpt-paraphrases | nateraw | 2023-11-14T22:16:36Z | 0 | 0 | null | [
"region:us"
] | 2023-11-14T22:16:36Z | 2023-11-14T21:41:47.000Z | 2023-11-14T21:41:47 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
dvegasl/your-dataset-name | dvegasl | 2023-11-14T22:03:46Z | 0 | 0 | null | [
"region:us"
] | 2023-11-14T22:03:46Z | 2023-11-14T22:03:46.000Z | 2023-11-14T22:03:46 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
gh1407/inference_pairs_clustered | gh1407 | 2023-11-14T22:22:46Z | 0 | 0 | null | [
"region:us"
] | 2023-11-14T22:22:46Z | 2023-11-14T22:10:19.000Z | 2023-11-14T22:10:19 | ---
dataset_info:
features:
- name: original_string
dtype: string
- name: cause_split
dtype: string
- name: effect_split
dtype: string
- name: cause_summarized
dtype: string
- name: effect_summarized
dtype: string
splits:
- name: train
num_bytes: 1060959
num_examples: 1302
download_size: 490417
dataset_size: 1060959
---
# Dataset Card for "inference_pairs_clustered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.5409017205238342,
-0.5347885489463806,
0.5127117037773132,
0.46021178364753723,
-0.14746242761611938,
-0.17244818806648254,
0.16338439285755157,
-0.03178737685084343,
0.8444741368293762,
0.2908228039741516,
-0.6070426106452942,
-0.8606574535369873,
-0.49625521898269653,
-0.3565672338008... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
sruly/StepBackSearch | sruly | 2023-11-14T23:06:03Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-14T23:06:03Z | 2023-11-14T22:33:22.000Z | 2023-11-14T22:33:22 | ---
license: apache-2.0
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Avatarofhemant/News | Avatarofhemant | 2023-11-14T22:44:17Z | 0 | 0 | null | [
"task_categories:zero-shot-classification",
"task_categories:text-classification",
"language:en",
"license:apache-2.0",
"art",
"music",
"region:us"
] | 2023-11-14T22:44:17Z | 2023-11-14T22:36:34.000Z | 2023-11-14T22:36:34 | ---
license: apache-2.0
task_categories:
- zero-shot-classification
- text-classification
language:
- en
tags:
- art
- music
--- | [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
ibleducation/ibl-best-practices-instructor-dataset | ibleducation | 2023-11-14T22:50:28Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-14T22:50:28Z | 2023-11-14T22:41:15.000Z | 2023-11-14T22:41:15 | ---
license: apache-2.0
---
# ibl-best-practices-instructor-dataset
This dataset contains a selection of prompts and appropriate responses demonstrating conversations between a student and a teacher.
Each response is generated to match an associated virtue expected of a professional teacher.
## Dataset Format
Each data point has three columns:
- Virtue: The corresponding positive virtue we expect a tutor to demonstrate
- Prompt: A question asked by a student to a teacher
- Response: An appropriate answer to the question asked by the student. This answer exhibits the virtue specified.
## Dataset Creation
The dataset was generated from a set of documents explaining qualities of a good teacher.
- These documents were fed to an LLM (GPT-3.5) to generate a list of positive virtues that a teacher must demonstrate as specified in the documents.
- A second prompt is then made to generate sample conversations (prompt and response pairs) where the response from the teacher clearly demonstrates the virtue specified.
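As a small illustration of this three-column format, the rows below are invented stand-ins (not actual dataset entries), grouped by the virtue their responses demonstrate:

```python
# Invented (virtue, prompt, response) rows in the format described above.
rows = [
    {"virtue": "patience",
     "prompt": "I still don't understand fractions. Can you explain again?",
     "response": "Of course. Let's go through it one step at a time."},
    {"virtue": "encouragement",
     "prompt": "I failed the quiz. Should I just give up?",
     "response": "Not at all. Let's review what went wrong and try again."},
]

# Group prompts by the virtue their responses demonstrate.
by_virtue = {}
for row in rows:
    by_virtue.setdefault(row["virtue"], []).append(row["prompt"])
print(sorted(by_virtue))
```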
| [
-0.6265634894371033,
-0.3631342947483063,
0.26316866278648376,
-0.013606064952909946,
-0.39510029554367065,
-0.3311018645763397,
0.06294485181570053,
-0.13876979053020477,
-0.17016206681728363,
0.6144702434539795,
-0.3615040183067322,
-0.6501511335372925,
-0.4365485608577728,
0.19439615309... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
AnilKamat/SongAlbumCover | AnilKamat | 2023-11-14T22:50:15Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-14T22:50:15Z | 2023-11-14T22:47:38.000Z | 2023-11-14T22:47:38 | ---
license: mit
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_teknium__OpenHermes-2.5-Mistral-7B_public | open-llm-leaderboard | 2023-11-20T08:23:35Z | 0 | 0 | null | [
"region:us"
] | 2023-11-20T08:23:35Z | 2023-11-14T22:47:46.000Z | 2023-11-14T22:47:46 | ---
pretty_name: Evaluation run of teknium/OpenHermes-2.5-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [teknium/OpenHermes-2.5-Mistral-7B](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_teknium__OpenHermes-2.5-Mistral-7B_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-20T08:19:51.425757](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__OpenHermes-2.5-Mistral-7B_public/blob/main/results_2023-11-20T08-19-51.425757.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.632448279866198,\n\
\ \"acc_stderr\": 0.032234779746923035,\n \"acc_norm\": 0.6400703538735191,\n\
\ \"acc_norm_stderr\": 0.03290159609429692,\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.5223798200841158,\n\
\ \"mc2_stderr\": 0.015238875045496005,\n \"em\": 0.30683724832214765,\n\
\ \"em_stderr\": 0.004722927724288475,\n \"f1\": 0.35794463087248424,\n\
\ \"f1_stderr\": 0.004623146117214951\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6126279863481229,\n \"acc_stderr\": 0.014235872487909869,\n\
\ \"acc_norm\": 0.6493174061433447,\n \"acc_norm_stderr\": 0.013944635930726099\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6522605058753237,\n\
\ \"acc_stderr\": 0.004752794829825045,\n \"acc_norm\": 0.8417645887273452,\n\
\ \"acc_norm_stderr\": 0.0036421571661623443\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n\
\ \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n\
\ \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n\
\ \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n \"\
acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4365079365079365,\n \"acc_stderr\": 0.0255428468174005,\n \"acc_norm\"\
: 0.4365079365079365,\n \"acc_norm_stderr\": 0.0255428468174005\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.02302589961718871,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.02302589961718871\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\"\
: 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130956,\n\
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130956\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02784081149587193,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02784081149587193\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886797,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886797\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530343,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530343\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313728,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313728\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.04742762361243011,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.04742762361243011\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834832,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834832\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468355,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468355\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3106145251396648,\n\
\ \"acc_stderr\": 0.015476515438005566,\n \"acc_norm\": 0.3106145251396648,\n\
\ \"acc_norm_stderr\": 0.015476515438005566\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140446,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140446\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"\
acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507215,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507215\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744543,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744543\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.027686913588013024,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.027686913588013024\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.5223798200841158,\n\
\ \"mc2_stderr\": 0.015238875045496005\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.30683724832214765,\n \
\ \"em_stderr\": 0.004722927724288475,\n \"f1\": 0.35794463087248424,\n\
\ \"f1_stderr\": 0.004623146117214951\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.2608036391205459,\n \"acc_stderr\": 0.012094252417332745\n\
\ }\n}\n```"
repo_url: https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|arc:challenge|25_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|arc:challenge|25_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|drop|3_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|drop|3_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|gsm8k|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|gsm8k|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hellaswag|10_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hellaswag|10_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-14T22-44-46.514057.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-20T08-19-51.425757.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-20T08-19-51.425757.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- '**/details_harness|winogrande|5_2023-11-14T22-44-46.514057.parquet'
- split: 2023_11_20T08_19_51.425757
path:
- '**/details_harness|winogrande|5_2023-11-20T08-19-51.425757.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-20T08-19-51.425757.parquet'
- config_name: results
data_files:
- split: 2023_11_14T22_44_46.514057
path:
- results_2023-11-14T22-44-46.514057.parquet
- split: 2023_11_20T08_19_51.425757
path:
- results_2023-11-20T08-19-51.425757.parquet
- split: latest
path:
- results_2023-11-20T08-19-51.425757.parquet
---
# Dataset Card for Evaluation run of teknium/OpenHermes-2.5-Mistral-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [teknium/OpenHermes-2.5-Mistral-7B](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_teknium__OpenHermes-2.5-Mistral-7B_public",
"harness_winogrande_5",
    split="latest")
```
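Each per-task config name above follows a fixed scheme derived from the harness task name and the few-shot count, which is how `"harness_winogrande_5"` is formed. A small sketch of that mapping (pure string handling, no download; the helper name is illustrative, not part of any library):

```python
def task_to_config_name(task: str, num_fewshot: int) -> str:
    """Map a harness task name (e.g. 'hendrycksTest-anatomy') to the
    config name used in this card (e.g. 'harness_hendrycksTest_anatomy_5').
    Hyphens and colons in task names become underscores."""
    return f"harness_{task.replace('-', '_').replace(':', '_')}_{num_fewshot}"

print(task_to_config_name("hendrycksTest-anatomy", 5))
# harness_hendrycksTest_anatomy_5
print(task_to_config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```

Any of the resulting names can be passed as the second argument to `load_dataset` above.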
## Latest results
These are the [latest results from run 2023-11-20T08:19:51.425757](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__OpenHermes-2.5-Mistral-7B_public/blob/main/results_2023-11-20T08-19-51.425757.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.632448279866198,
"acc_stderr": 0.032234779746923035,
"acc_norm": 0.6400703538735191,
"acc_norm_stderr": 0.03290159609429692,
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836882,
"mc2": 0.5223798200841158,
"mc2_stderr": 0.015238875045496005,
"em": 0.30683724832214765,
"em_stderr": 0.004722927724288475,
"f1": 0.35794463087248424,
"f1_stderr": 0.004623146117214951
},
"harness|arc:challenge|25": {
"acc": 0.6126279863481229,
"acc_stderr": 0.014235872487909869,
"acc_norm": 0.6493174061433447,
"acc_norm_stderr": 0.013944635930726099
},
"harness|hellaswag|10": {
"acc": 0.6522605058753237,
"acc_stderr": 0.004752794829825045,
"acc_norm": 0.8417645887273452,
"acc_norm_stderr": 0.0036421571661623443
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.0255428468174005,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.0255428468174005
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718871,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718871
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483016,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483016
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.02833560973246336,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.02833560973246336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130956,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130956
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02784081149587193,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02784081149587193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886797,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886797
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530343,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313728,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313728
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.04742762361243011,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.04742762361243011
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834832,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834832
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468355,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468355
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3106145251396648,
"acc_stderr": 0.015476515438005566,
"acc_norm": 0.3106145251396648,
"acc_norm_stderr": 0.015476515438005566
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140446,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140446
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507215,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507215
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.027979823538744543,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.027979823538744543
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013024,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013024
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836882,
"mc2": 0.5223798200841158,
"mc2_stderr": 0.015238875045496005
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
},
"harness|drop|3": {
"em": 0.30683724832214765,
"em_stderr": 0.004722927724288475,
"f1": 0.35794463087248424,
"f1_stderr": 0.004623146117214951
},
"harness|gsm8k|5": {
"acc": 0.2608036391205459,
"acc_stderr": 0.012094252417332745
}
}
```
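For quick inspection, per-task scores in a results file like the one above can be aggregated with a short script. The sketch below is illustrative: the `results` dict mirrors the shape of a few entries above (the real file holds all 57 hendrycksTest/MMLU subtasks), and `mmlu_average` is a hypothetical helper, not part of the harness.

```python
from statistics import mean

# A small excerpt in the same shape as the leaderboard JSON above
# (the real results file covers every hendrycksTest subtask).
results = {
    "harness|hendrycksTest-marketing|5": {"acc_norm": 0.8717948717948718},
    "harness|hendrycksTest-medical_genetics|5": {"acc_norm": 0.7},
    "harness|hendrycksTest-virology|5": {"acc_norm": 0.5542168674698795},
}

def mmlu_average(results: dict) -> float:
    """Mean acc_norm over the hendrycksTest (MMLU) subtasks present."""
    scores = [
        v["acc_norm"]
        for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")
    ]
    return mean(scores)

print(round(mmlu_average(results), 4))  # → 0.7087
```

The same pattern extends to the other metrics (`acc`, `mc2`, `f1`) by swapping the key name.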
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7161073088645935,
-0.8484865427017212,
0.27123913168907166,
0.16766472160816193,
-0.19044552743434906,
-0.058948587626218796,
-0.023389644920825958,
-0.19104664027690887,
0.5454065799713135,
-0.051905445754528046,
-0.4623368978500366,
-0.7545487880706787,
-0.4379877746105194,
0.24470670... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Abcdefghijklmnopqrstuvwxyz12/Aespa | Abcdefghijklmnopqrstuvwxyz12 | 2023-11-27T23:39:50Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-27T23:39:50Z | 2023-11-14T22:53:00.000Z | 2023-11-14T22:53:00 | ---
license: apache-2.0
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Sammelgro/control_concepts | Sammelgro | 2023-11-14T23:20:31Z | 0 | 0 | null | [
"license:llama2",
"region:us"
] | 2023-11-14T23:20:31Z | 2023-11-14T23:20:31.000Z | 2023-11-14T23:20:31 | ---
license: llama2
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
zeio/batch-speech | zeio | 2023-11-14T22:51:33Z | 0 | 0 | null | [
"region:us"
] | 2023-11-14T22:51:33Z | 2023-11-14T23:43:12.000Z | 2023-11-14T23:43:12 | # Batch speech
TODO: add description
| [
-0.47497114539146423,
-0.25549405813217163,
0.33636152744293213,
0.730536699295044,
-0.5326333045959473,
-0.020199213176965714,
-0.3906959295272827,
-0.044297460466623306,
0.7518500089645386,
0.9255887866020203,
-0.647126317024231,
-0.36276450753211975,
-0.678122341632843,
-0.4180499613285... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
syntag/company | syntag | 2023-11-14T23:56:51Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-14T23:56:51Z | 2023-11-14T23:56:51.000Z | 2023-11-14T23:56:51 | ---
license: mit
---
| [
-0.1285337507724762,
-0.18616773188114166,
0.6529127359390259,
0.4943627715110779,
-0.193193256855011,
0.23607444763183594,
0.36071985960006714,
0.050563156604766846,
0.5793652534484863,
0.7400138974189758,
-0.6508103013038635,
-0.23783966898918152,
-0.7102247476577759,
-0.0478259548544883... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Manuja2008/EduPro | Manuja2008 | 2023-11-14T23:58:16Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-14T23:58:16Z | 2023-11-14T23:58:16.000Z | 2023-11-14T23:58:16 | ---
license: mit
---
| [
-0.1285337507724762,
-0.18616773188114166,
0.6529127359390259,
0.4943627715110779,
-0.193193256855011,
0.23607444763183594,
0.36071985960006714,
0.050563156604766846,
0.5793652534484863,
0.7400138974189758,
-0.6508103013038635,
-0.23783966898918152,
-0.7102247476577759,
-0.0478259548544883... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
brandon12333/otis.dataset | brandon12333 | 2023-11-15T00:12:44Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-15T00:12:44Z | 2023-11-15T00:08:31.000Z | 2023-11-15T00:08:31 | ---
license: apache-2.0
---
| [
-0.1285337507724762,
-0.18616773188114166,
0.6529127359390259,
0.4943627715110779,
-0.193193256855011,
0.23607444763183594,
0.36071985960006714,
0.050563156604766846,
0.5793652534484863,
0.7400138974189758,
-0.6508103013038635,
-0.23783966898918152,
-0.7102247476577759,
-0.0478259548544883... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
jlbaker361/test | jlbaker361 | 2023-11-15T00:26:42Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T00:26:42Z | 2023-11-15T00:26:41.000Z | 2023-11-15T00:26:41 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 231.7
num_examples: 7
- name: test
num_bytes: 99.3
num_examples: 3
download_size: 3607
dataset_size: 331.0
---
# Dataset Card for "test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.6750994920730591,
-0.41924890875816345,
0.0809972956776619,
0.19183474779129028,
-0.1334189921617508,
0.008788548409938812,
0.24120904505252838,
-0.13406118750572205,
0.7381706237792969,
0.33343809843063354,
-0.8203107714653015,
-0.6550990343093872,
-0.47400251030921936,
-0.187489449977... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_sequelbox__DaringFortitude_public | open-llm-leaderboard | 2023-11-15T00:39:41Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T00:39:41Z | 2023-11-15T00:38:54.000Z | 2023-11-15T00:38:54 | ---
pretty_name: Evaluation run of sequelbox/DaringFortitude
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [sequelbox/DaringFortitude](https://huggingface.co/sequelbox/DaringFortitude)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sequelbox__DaringFortitude_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-15T00:35:47.431209](https://huggingface.co/datasets/open-llm-leaderboard/details_sequelbox__DaringFortitude_public/blob/main/results_2023-11-15T00-35-47.431209.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5932217761298214,\n\
\ \"acc_stderr\": 0.03305656216343214,\n \"acc_norm\": 0.6027951864354921,\n\
\ \"acc_norm_stderr\": 0.03382034227909779,\n \"mc1\": 0.40269277845777235,\n\
\ \"mc1_stderr\": 0.017168830935187215,\n \"mc2\": 0.559561930249219,\n\
\ \"mc2_stderr\": 0.015693079433704838,\n \"em\": 0.01950503355704698,\n\
\ \"em_stderr\": 0.0014162361849700607,\n \"f1\": 0.12218750000000013,\n\
\ \"f1_stderr\": 0.002284380268622334\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6032423208191127,\n \"acc_stderr\": 0.01429651302018063,\n\
\ \"acc_norm\": 0.6348122866894198,\n \"acc_norm_stderr\": 0.014070265519268802\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6360286795459071,\n\
\ \"acc_stderr\": 0.004801572028920796,\n \"acc_norm\": 0.8355905198167696,\n\
\ \"acc_norm_stderr\": 0.003698892388380099\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.03268335899936336,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.03268335899936336\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.04514496132873634,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.04514496132873634\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819067,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819067\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n\
\ \"acc_stderr\": 0.026729499068349958,\n \"acc_norm\": 0.6709677419354839,\n\
\ \"acc_norm_stderr\": 0.026729499068349958\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.035014387062967806,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.035014387062967806\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187222,\n\
\ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187222\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7889908256880734,\n \"acc_stderr\": 0.017493922404112648,\n \"\
acc_norm\": 0.7889908256880734,\n \"acc_norm_stderr\": 0.017493922404112648\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.02615686752393104,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02615686752393104\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.02336505149175372,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.02336505149175372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7956577266922095,\n\
\ \"acc_stderr\": 0.0144191239809319,\n \"acc_norm\": 0.7956577266922095,\n\
\ \"acc_norm_stderr\": 0.0144191239809319\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.02557412378654667,\n\
\ \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.02557412378654667\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48268156424581005,\n\
\ \"acc_stderr\": 0.01671246744170252,\n \"acc_norm\": 0.48268156424581005,\n\
\ \"acc_norm_stderr\": 0.01671246744170252\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.026925654653615693,\n\
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.026925654653615693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45827900912646674,\n\
\ \"acc_stderr\": 0.012725701656953642,\n \"acc_norm\": 0.45827900912646674,\n\
\ \"acc_norm_stderr\": 0.012725701656953642\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.0296246635811597,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.0296246635811597\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5915032679738562,\n \"acc_stderr\": 0.01988622103750187,\n \
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.01988622103750187\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.03002105623844031,\n\
\ \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.03002105623844031\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40269277845777235,\n\
\ \"mc1_stderr\": 0.017168830935187215,\n \"mc2\": 0.559561930249219,\n\
\ \"mc2_stderr\": 0.015693079433704838\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650865\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.01950503355704698,\n \
\ \"em_stderr\": 0.0014162361849700607,\n \"f1\": 0.12218750000000013,\n\
\ \"f1_stderr\": 0.002284380268622334\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.08794541319181198,\n \"acc_stderr\": 0.007801162197487721\n\
\ }\n}\n```"
repo_url: https://huggingface.co/sequelbox/DaringFortitude
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|arc:challenge|25_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|drop|3_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|gsm8k|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hellaswag|10_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|winogrande|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-15T00-35-47.431209.parquet'
- config_name: results
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- results_2023-11-15T00-35-47.431209.parquet
- split: latest
path:
- results_2023-11-15T00-35-47.431209.parquet
---
# Dataset Card for Evaluation run of sequelbox/DaringFortitude
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/sequelbox/DaringFortitude
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [sequelbox/DaringFortitude](https://huggingface.co/sequelbox/DaringFortitude) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sequelbox__DaringFortitude_public",
"harness_winogrande_5",
	split="latest")
```
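The timestamped split names follow directly from the run timestamp: as the YAML configuration above shows, every dash becomes an underscore while the fractional-second dot is kept. A minimal sketch of that mapping (an observation from this card's own YAML, not an official API):

```python
# Derive the split name and results filename from a run timestamp.
# Assumption: the naming scheme visible in this card's YAML configuration.
timestamp = "2023-11-15T00-35-47.431209"

# Split names replace every dash with an underscore (the dot is kept).
split_name = timestamp.replace("-", "_")

# The aggregated results file embeds the timestamp verbatim.
results_file = f"results_{timestamp}.parquet"

print(split_name)    # 2023_11_15T00_35_47.431209
print(results_file)  # results_2023-11-15T00-35-47.431209.parquet
```

This lets you address a specific historical run by passing its derived split name instead of `"latest"`.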
## Latest results
These are the [latest results from run 2023-11-15T00:35:47.431209](https://huggingface.co/datasets/open-llm-leaderboard/details_sequelbox__DaringFortitude_public/blob/main/results_2023-11-15T00-35-47.431209.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5932217761298214,
"acc_stderr": 0.03305656216343214,
"acc_norm": 0.6027951864354921,
"acc_norm_stderr": 0.03382034227909779,
"mc1": 0.40269277845777235,
"mc1_stderr": 0.017168830935187215,
"mc2": 0.559561930249219,
"mc2_stderr": 0.015693079433704838,
"em": 0.01950503355704698,
"em_stderr": 0.0014162361849700607,
"f1": 0.12218750000000013,
"f1_stderr": 0.002284380268622334
},
"harness|arc:challenge|25": {
"acc": 0.6032423208191127,
"acc_stderr": 0.01429651302018063,
"acc_norm": 0.6348122866894198,
"acc_norm_stderr": 0.014070265519268802
},
"harness|hellaswag|10": {
"acc": 0.6360286795459071,
"acc_stderr": 0.004801572028920796,
"acc_norm": 0.8355905198167696,
"acc_norm_stderr": 0.003698892388380099
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319616,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.03268335899936336,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.03268335899936336
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873634,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873634
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819067,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819067
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.026729499068349958,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.026729499068349958
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187222,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187222
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7889908256880734,
"acc_stderr": 0.017493922404112648,
"acc_norm": 0.7889908256880734,
"acc_norm_stderr": 0.017493922404112648
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02615686752393104,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02615686752393104
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908706,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908706
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724146,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.02336505149175372,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.02336505149175372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7956577266922095,
"acc_stderr": 0.0144191239809319,
"acc_norm": 0.7956577266922095,
"acc_norm_stderr": 0.0144191239809319
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.02557412378654667,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.02557412378654667
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48268156424581005,
"acc_stderr": 0.01671246744170252,
"acc_norm": 0.48268156424581005,
"acc_norm_stderr": 0.01671246744170252
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.026925654653615693,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.026925654653615693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45827900912646674,
"acc_stderr": 0.012725701656953642,
"acc_norm": 0.45827900912646674,
"acc_norm_stderr": 0.012725701656953642
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.0296246635811597,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.0296246635811597
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.01988622103750187,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.01988622103750187
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.03002105623844031,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.03002105623844031
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014645,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014645
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40269277845777235,
"mc1_stderr": 0.017168830935187215,
"mc2": 0.559561930249219,
"mc2_stderr": 0.015693079433704838
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.011920008163650865
},
"harness|drop|3": {
"em": 0.01950503355704698,
"em_stderr": 0.0014162361849700607,
"f1": 0.12218750000000013,
"f1_stderr": 0.002284380268622334
},
"harness|gsm8k|5": {
"acc": 0.08794541319181198,
"acc_stderr": 0.007801162197487721
}
}
```
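The per-task scores above are plain JSON, so they can also be inspected locally without the leaderboard UI. A minimal sketch (the task names and values below are copied verbatim from the results block above; the selection of tasks is arbitrary and for illustration only):

```python
# Minimal local sketch (not part of the evaluation harness): rank a few
# per-task accuracies copied verbatim from the results JSON above.
results = {
    "harness|hendrycksTest-marketing|5": 0.8504273504273504,
    "harness|hendrycksTest-world_religions|5": 0.8011695906432749,
    "harness|hendrycksTest-virology|5": 0.4939759036144578,
    "harness|hendrycksTest-elementary_mathematics|5": 0.3412698412698413,
}

# Sort tasks from strongest to weakest accuracy.
ranked = sorted(results.items(), key=lambda kv: kv[1], reverse=True)
for task, acc in ranked:
    print(f"{task}: {acc:.3f}")
```

The same pattern extends to the full results file once downloaded from this repository.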
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.6601749062538147,
-0.8284251689910889,
0.2756709158420563,
0.18555501103401184,
-0.16714918613433838,
-0.013734507374465466,
0.09642694890499115,
-0.15291006863117218,
0.5148459076881409,
-0.04294617846608162,
-0.5153998136520386,
-0.6921094655990601,
-0.45460832118988037,
0.20895366370... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_ValiantLabs__ShiningValiantXS_public | open-llm-leaderboard | 2023-11-15T00:42:15Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T00:42:15Z | 2023-11-15T00:41:28.000Z | 2023-11-15T00:41:28 | ---
pretty_name: Evaluation run of ValiantLabs/ShiningValiantXS
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ValiantLabs/ShiningValiantXS](https://huggingface.co/ValiantLabs/ShiningValiantXS)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ValiantLabs__ShiningValiantXS_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-15T00:38:24.676368](https://huggingface.co/datasets/open-llm-leaderboard/details_ValiantLabs__ShiningValiantXS_public/blob/main/results_2023-11-15T00-38-24.676368.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5932217761298214,\n\
\ \"acc_stderr\": 0.03305656216343214,\n \"acc_norm\": 0.6027951864354921,\n\
\ \"acc_norm_stderr\": 0.03382034227909779,\n \"mc1\": 0.40269277845777235,\n\
\ \"mc1_stderr\": 0.017168830935187215,\n \"mc2\": 0.559561930249219,\n\
\ \"mc2_stderr\": 0.015693079433704838,\n \"em\": 0.01950503355704698,\n\
\ \"em_stderr\": 0.0014162361849700607,\n \"f1\": 0.12218750000000013,\n\
\ \"f1_stderr\": 0.002284380268622334\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6032423208191127,\n \"acc_stderr\": 0.01429651302018063,\n\
\ \"acc_norm\": 0.6348122866894198,\n \"acc_norm_stderr\": 0.014070265519268802\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6360286795459071,\n\
\ \"acc_stderr\": 0.004801572028920796,\n \"acc_norm\": 0.8355905198167696,\n\
\ \"acc_norm_stderr\": 0.003698892388380099\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.03268335899936336,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.03268335899936336\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.04514496132873634,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.04514496132873634\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819067,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819067\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n\
\ \"acc_stderr\": 0.026729499068349958,\n \"acc_norm\": 0.6709677419354839,\n\
\ \"acc_norm_stderr\": 0.026729499068349958\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.035014387062967806,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.035014387062967806\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187222,\n\
\ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187222\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7889908256880734,\n \"acc_stderr\": 0.017493922404112648,\n \"\
acc_norm\": 0.7889908256880734,\n \"acc_norm_stderr\": 0.017493922404112648\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.02615686752393104,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02615686752393104\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.02336505149175372,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.02336505149175372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7956577266922095,\n\
\ \"acc_stderr\": 0.0144191239809319,\n \"acc_norm\": 0.7956577266922095,\n\
\ \"acc_norm_stderr\": 0.0144191239809319\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.02557412378654667,\n\
\ \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.02557412378654667\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48268156424581005,\n\
\ \"acc_stderr\": 0.01671246744170252,\n \"acc_norm\": 0.48268156424581005,\n\
\ \"acc_norm_stderr\": 0.01671246744170252\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.026925654653615693,\n\
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.026925654653615693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45827900912646674,\n\
\ \"acc_stderr\": 0.012725701656953642,\n \"acc_norm\": 0.45827900912646674,\n\
\ \"acc_norm_stderr\": 0.012725701656953642\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.0296246635811597,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.0296246635811597\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5915032679738562,\n \"acc_stderr\": 0.01988622103750187,\n \
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.01988622103750187\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.03002105623844031,\n\
\ \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.03002105623844031\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40269277845777235,\n\
\ \"mc1_stderr\": 0.017168830935187215,\n \"mc2\": 0.559561930249219,\n\
\ \"mc2_stderr\": 0.015693079433704838\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650865\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.01950503355704698,\n \
\ \"em_stderr\": 0.0014162361849700607,\n \"f1\": 0.12218750000000013,\n\
\ \"f1_stderr\": 0.002284380268622334\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.08794541319181198,\n \"acc_stderr\": 0.007801162197487721\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ValiantLabs/ShiningValiantXS
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|arc:challenge|25_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|drop|3_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|gsm8k|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hellaswag|10_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T00-38-24.676368.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-15T00-38-24.676368.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- '**/details_harness|winogrande|5_2023-11-15T00-38-24.676368.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-15T00-38-24.676368.parquet'
- config_name: results
data_files:
- split: 2023_11_15T00_38_24.676368
path:
- results_2023-11-15T00-38-24.676368.parquet
- split: latest
path:
- results_2023-11-15T00-38-24.676368.parquet
---
# Dataset Card for Evaluation run of ValiantLabs/ShiningValiantXS
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ValiantLabs/ShiningValiantXS
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ValiantLabs/ShiningValiantXS](https://huggingface.co/ValiantLabs/ShiningValiantXS) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ValiantLabs__ShiningValiantXS_public",
"harness_winogrande_5",
split="train")
```
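Since each split is named after the run's timestamp with underscores substituted for the `-` and `:` separators of an ISO timestamp, the name can be converted back to a `datetime` for sorting or comparing runs. A minimal sketch (the helper name `split_to_datetime` is illustrative, not part of any library):

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    # Split names like "2023_11_15T00_38_24.676368" use underscores
    # in place of the "-" and ":" of an ISO timestamp; restore them.
    date_part, time_part = split_name.split("T")
    date_part = date_part.replace("_", "-")
    time_part = time_part.replace("_", ":")
    return datetime.fromisoformat(f"{date_part}T{time_part}")

print(split_to_datetime("2023_11_15T00_38_24.676368"))
# → 2023-11-15 00:38:24.676368
```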
## Latest results
These are the [latest results from run 2023-11-15T00:38:24.676368](https://huggingface.co/datasets/open-llm-leaderboard/details_ValiantLabs__ShiningValiantXS_public/blob/main/results_2023-11-15T00-38-24.676368.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5932217761298214,
"acc_stderr": 0.03305656216343214,
"acc_norm": 0.6027951864354921,
"acc_norm_stderr": 0.03382034227909779,
"mc1": 0.40269277845777235,
"mc1_stderr": 0.017168830935187215,
"mc2": 0.559561930249219,
"mc2_stderr": 0.015693079433704838,
"em": 0.01950503355704698,
"em_stderr": 0.0014162361849700607,
"f1": 0.12218750000000013,
"f1_stderr": 0.002284380268622334
},
"harness|arc:challenge|25": {
"acc": 0.6032423208191127,
"acc_stderr": 0.01429651302018063,
"acc_norm": 0.6348122866894198,
"acc_norm_stderr": 0.014070265519268802
},
"harness|hellaswag|10": {
"acc": 0.6360286795459071,
"acc_stderr": 0.004801572028920796,
"acc_norm": 0.8355905198167696,
"acc_norm_stderr": 0.003698892388380099
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319616,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.03268335899936336,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.03268335899936336
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873634,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873634
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819067,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819067
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.026729499068349958,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.026729499068349958
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187222,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187222
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7889908256880734,
"acc_stderr": 0.017493922404112648,
"acc_norm": 0.7889908256880734,
"acc_norm_stderr": 0.017493922404112648
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02615686752393104,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02615686752393104
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908706,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908706
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724146,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.02336505149175372,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.02336505149175372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7956577266922095,
"acc_stderr": 0.0144191239809319,
"acc_norm": 0.7956577266922095,
"acc_norm_stderr": 0.0144191239809319
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.02557412378654667,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.02557412378654667
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48268156424581005,
"acc_stderr": 0.01671246744170252,
"acc_norm": 0.48268156424581005,
"acc_norm_stderr": 0.01671246744170252
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.026925654653615693,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.026925654653615693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45827900912646674,
"acc_stderr": 0.012725701656953642,
"acc_norm": 0.45827900912646674,
"acc_norm_stderr": 0.012725701656953642
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.0296246635811597,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.0296246635811597
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.01988622103750187,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.01988622103750187
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.03002105623844031,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.03002105623844031
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014645,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014645
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40269277845777235,
"mc1_stderr": 0.017168830935187215,
"mc2": 0.559561930249219,
"mc2_stderr": 0.015693079433704838
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.011920008163650865
},
"harness|drop|3": {
"em": 0.01950503355704698,
"em_stderr": 0.0014162361849700607,
"f1": 0.12218750000000013,
"f1_stderr": 0.002284380268622334
},
"harness|gsm8k|5": {
"acc": 0.08794541319181198,
"acc_stderr": 0.007801162197487721
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.6881926655769348,
-0.8044857978820801,
0.28336524963378906,
0.22669494152069092,
-0.16508561372756958,
-0.04238898307085037,
0.02037423476576805,
-0.17043693363666534,
0.5794963836669922,
-0.034220028668642044,
-0.4887380003929138,
-0.7130363583564758,
-0.4334738254547119,
0.23511680960... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
ashishpvjs/hf-codegen | ashishpvjs | 2023-11-15T02:16:54Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T02:16:54Z | 2023-11-15T01:14:21.000Z | 2023-11-15T01:14:21 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
GiovanniHD/AmiMizuno | GiovanniHD | 2023-11-15T01:18:49Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-15T01:18:49Z | 2023-11-15T01:16:13.000Z | 2023-11-15T01:16:13 | ---
license: openrail
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
ErhaChen/cartoon_room | ErhaChen | 2023-11-15T01:37:08Z | 0 | 0 | null | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:apache-2.0",
"cartoon room",
"style",
"region:us"
] | 2023-11-15T01:37:08Z | 2023-11-15T01:35:26.000Z | 2023-11-15T01:35:26 | ---
license: apache-2.0
task_categories:
- text-to-image
tags:
- cartoon room
- style
size_categories:
- n<1K
--- | [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
antonshenk/RAND-wizmap | antonshenk | 2023-11-15T01:54:35Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T01:54:35Z | 2023-11-15T01:43:29.000Z | 2023-11-15T01:43:29 | Entry not found | [
-0.32276472449302673,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965679168701,
0.7915717363357544,
0.07618629932403564,
0.7746022939682007,
0.2563222646713257,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Rewcifer/best_outputs_selected_50_3model | Rewcifer | 2023-11-15T01:44:45Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T01:44:45Z | 2023-11-15T01:44:44.000Z | 2023-11-15T01:44:44 | ---
dataset_info:
features:
- name: true_findings
dtype: string
- name: generated_texts_1
dtype: string
- name: generated_texts_2
dtype: string
- name: generated_texts_3
dtype: string
splits:
- name: train
num_bytes: 108090.4181184669
num_examples: 50
download_size: 83094
dataset_size: 108090.4181184669
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "best_outputs_selected_50_3model"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.6520466804504395,
0.04668660834431648,
0.3308957517147064,
0.3125123381614685,
-0.2432822287082672,
-0.2464945763349533,
0.38798511028289795,
0.08926510065793991,
0.5439279079437256,
0.5153292417526245,
-0.6266983151435852,
-0.7637103796005249,
-0.6761879324913025,
-0.028744332492351532... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
furry-br/amy | furry-br | 2023-11-15T02:00:21Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-15T02:00:21Z | 2023-11-15T01:59:56.000Z | 2023-11-15T01:59:56 | ---
license: openrail
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
sonia12138/lyrics-emotion | sonia12138 | 2023-11-15T03:20:42Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-15T03:20:42Z | 2023-11-15T02:53:43.000Z | 2023-11-15T02:53:43 | ---
license: mit
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
v2ray/Tony-Chase-Transcripts | v2ray | 2023-11-15T03:46:12Z | 0 | 0 | null | [
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:en",
"license:mit",
"not-for-all-audiences",
"region:us"
] | 2023-11-15T03:46:12Z | 2023-11-15T03:26:39.000Z | 2023-11-15T03:26:39 | ---
license: mit
task_categories:
- text-generation
language:
- en
tags:
- not-for-all-audiences
size_categories:
- 1K<n<10K
---
# Tony Chase Transcripts
Around 3500 transcripts of videos from [Tony Chase](https://www.youtube.com/@tonychaseiamnotgaynotnever2043) captioned with GPT-3.5-Turbo. | [
-0.33819353580474854,
-0.09624585509300232,
0.7944098711013794,
0.32815536856651306,
-0.15091200172901154,
0.5331977009773254,
0.14524734020233154,
0.09239725768566132,
0.34119582176208496,
0.2305729240179062,
-0.8400949835777283,
-0.02334209345281124,
-0.6459415555000305,
-0.2094101309776... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
xDAN-datasets/Chinese-medical-dialogue-799k | xDAN-datasets | 2023-11-15T03:34:33Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T03:34:33Z | 2023-11-15T03:30:36.000Z | 2023-11-15T03:30:36 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1117445166
num_examples: 799743
download_size: 571985574
dataset_size: 1117445166
---
# Dataset Card for "Chinese-medical-dialogue-799k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.14304684102535248,
-0.0972403958439827,
0.3756391406059265,
0.3363245129585266,
-0.46487605571746826,
-0.12259471416473389,
0.03228720650076866,
-0.15672631561756134,
0.8395463228225708,
0.5107381343841553,
-0.7687605619430542,
-0.8030067682266235,
-0.4838924705982208,
-0.13180686533451... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_lgaalves__gpt2-xl_lima_public | open-llm-leaderboard | 2023-11-15T03:48:58Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T03:48:58Z | 2023-11-15T03:48:13.000Z | 2023-11-15T03:48:13 | ---
pretty_name: Evaluation run of lgaalves/gpt2-xl_lima
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lgaalves/gpt2-xl_lima](https://huggingface.co/lgaalves/gpt2-xl_lima) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__gpt2-xl_lima_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-15T03:46:31.104311](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt2-xl_lima_public/blob/main/results_2023-11-15T03-46-31.104311.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2579848503192349,\n\
\ \"acc_stderr\": 0.030758432385023834,\n \"acc_norm\": 0.25961199994409145,\n\
\ \"acc_norm_stderr\": 0.03153372055003476,\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023507,\n \"mc2\": 0.3874325444900457,\n\
\ \"mc2_stderr\": 0.014089660369122726,\n \"em\": 0.002726510067114094,\n\
\ \"em_stderr\": 0.0005340111700415908,\n \"f1\": 0.04890100671140956,\n\
\ \"f1_stderr\": 0.0013085576550093093\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2645051194539249,\n \"acc_stderr\": 0.012889272949313368,\n\
\ \"acc_norm\": 0.31143344709897613,\n \"acc_norm_stderr\": 0.013532472099850949\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.39842660824536946,\n\
\ \"acc_stderr\": 0.004885735963346903,\n \"acc_norm\": 0.5128460466042621,\n\
\ \"acc_norm_stderr\": 0.004988134303021793\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n\
\ \"acc_stderr\": 0.03712537833614865,\n \"acc_norm\": 0.24444444444444444,\n\
\ \"acc_norm_stderr\": 0.03712537833614865\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.033911609343436025,\n\
\ \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.033911609343436025\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.30566037735849055,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.30566037735849055,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.037455547914624576,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.037455547914624576\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3352601156069364,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.3352601156069364,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.13725490196078433,\n \"acc_stderr\": 0.03424084669891522,\n\
\ \"acc_norm\": 0.13725490196078433,\n \"acc_norm_stderr\": 0.03424084669891522\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.251063829787234,\n \"acc_stderr\": 0.028346963777162445,\n\
\ \"acc_norm\": 0.251063829787234,\n \"acc_norm_stderr\": 0.028346963777162445\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.03752833958003336,\n\
\ \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.03752833958003336\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20105820105820105,\n \"acc_stderr\": 0.020641810782370165,\n \"\
acc_norm\": 0.20105820105820105,\n \"acc_norm_stderr\": 0.020641810782370165\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.041049472699033945,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.041049472699033945\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.20967741935483872,\n\
\ \"acc_stderr\": 0.02315787934908352,\n \"acc_norm\": 0.20967741935483872,\n\
\ \"acc_norm_stderr\": 0.02315787934908352\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.15763546798029557,\n \"acc_stderr\": 0.025639014131172408,\n\
\ \"acc_norm\": 0.15763546798029557,\n \"acc_norm_stderr\": 0.025639014131172408\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3151515151515151,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.3151515151515151,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3383838383838384,\n \"acc_stderr\": 0.03371124142626302,\n \"\
acc_norm\": 0.3383838383838384,\n \"acc_norm_stderr\": 0.03371124142626302\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.03027690994517825,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.03027690994517825\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3564102564102564,\n \"acc_stderr\": 0.024283140529467295,\n\
\ \"acc_norm\": 0.3564102564102564,\n \"acc_norm_stderr\": 0.024283140529467295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655078,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655078\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.02720537153827948,\n \
\ \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.02720537153827948\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3522935779816514,\n \"acc_stderr\": 0.020480568843999,\n \"acc_norm\"\
: 0.3522935779816514,\n \"acc_norm_stderr\": 0.020480568843999\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4537037037037037,\n\
\ \"acc_stderr\": 0.033953227263757976,\n \"acc_norm\": 0.4537037037037037,\n\
\ \"acc_norm_stderr\": 0.033953227263757976\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.20098039215686275,\n \"acc_stderr\": 0.028125972265654373,\n\
\ \"acc_norm\": 0.20098039215686275,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.20675105485232068,\n \"acc_stderr\": 0.026361651668389094,\n \
\ \"acc_norm\": 0.20675105485232068,\n \"acc_norm_stderr\": 0.026361651668389094\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.14349775784753363,\n\
\ \"acc_stderr\": 0.0235293712696182,\n \"acc_norm\": 0.14349775784753363,\n\
\ \"acc_norm_stderr\": 0.0235293712696182\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.15702479338842976,\n \"acc_stderr\": 0.0332124484254713,\n \"\
acc_norm\": 0.15702479338842976,\n \"acc_norm_stderr\": 0.0332124484254713\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.27607361963190186,\n \"acc_stderr\": 0.0351238528370505,\n\
\ \"acc_norm\": 0.27607361963190186,\n \"acc_norm_stderr\": 0.0351238528370505\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404565,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404565\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.27184466019417475,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.27184466019417475,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23931623931623933,\n\
\ \"acc_stderr\": 0.02795182680892433,\n \"acc_norm\": 0.23931623931623933,\n\
\ \"acc_norm_stderr\": 0.02795182680892433\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26181353767560667,\n\
\ \"acc_stderr\": 0.015720838678445256,\n \"acc_norm\": 0.26181353767560667,\n\
\ \"acc_norm_stderr\": 0.015720838678445256\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351294,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351294\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2057877813504823,\n\
\ \"acc_stderr\": 0.022961339906764244,\n \"acc_norm\": 0.2057877813504823,\n\
\ \"acc_norm_stderr\": 0.022961339906764244\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25617283950617287,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.25617283950617287,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23049645390070922,\n \"acc_stderr\": 0.02512373922687241,\n \
\ \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.02512373922687241\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24119947848761408,\n\
\ \"acc_stderr\": 0.010926496102034956,\n \"acc_norm\": 0.24119947848761408,\n\
\ \"acc_norm_stderr\": 0.010926496102034956\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.19852941176470587,\n \"acc_stderr\": 0.024231013370541107,\n\
\ \"acc_norm\": 0.19852941176470587,\n \"acc_norm_stderr\": 0.024231013370541107\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2434640522875817,\n \"acc_stderr\": 0.017362473762146634,\n \
\ \"acc_norm\": 0.2434640522875817,\n \"acc_norm_stderr\": 0.017362473762146634\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.2727272727272727,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2163265306122449,\n \"acc_stderr\": 0.02635891633490403,\n\
\ \"acc_norm\": 0.2163265306122449,\n \"acc_norm_stderr\": 0.02635891633490403\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\
\ \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.2537313432835821,\n\
\ \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.26506024096385544,\n\
\ \"acc_stderr\": 0.03436024037944967,\n \"acc_norm\": 0.26506024096385544,\n\
\ \"acc_norm_stderr\": 0.03436024037944967\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023507,\n \"mc2\": 0.3874325444900457,\n\
\ \"mc2_stderr\": 0.014089660369122726\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5722178374112076,\n \"acc_stderr\": 0.013905134013839943\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.002726510067114094,\n \
\ \"em_stderr\": 0.0005340111700415908,\n \"f1\": 0.04890100671140956,\n\
\ \"f1_stderr\": 0.0013085576550093093\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.009097801364670205,\n \"acc_stderr\": 0.002615326510775673\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lgaalves/gpt2-xl_lima
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|arc:challenge|25_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|drop|3_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|gsm8k|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hellaswag|10_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T03-46-31.104311.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-15T03-46-31.104311.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- '**/details_harness|winogrande|5_2023-11-15T03-46-31.104311.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-15T03-46-31.104311.parquet'
- config_name: results
data_files:
- split: 2023_11_15T03_46_31.104311
path:
- results_2023-11-15T03-46-31.104311.parquet
- split: latest
path:
- results_2023-11-15T03-46-31.104311.parquet
---
# Dataset Card for Evaluation run of lgaalves/gpt2-xl_lima
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lgaalves/gpt2-xl_lima
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lgaalves/gpt2-xl_lima](https://huggingface.co/lgaalves/gpt2-xl_lima) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lgaalves__gpt2-xl_lima_public",
"harness_winogrande_5",
split="train")
```
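The per-task configuration names listed above follow a regular pattern: for the MMLU ("hendrycksTest") subtasks it is `harness_hendrycksTest_<subject>_<n_shot>`, where `<n_shot>` is the number of few-shot examples (5 here). A minimal sketch of that naming convention, using a hypothetical helper function that is not part of the `datasets` API:

```python
# Illustrative helper (hypothetical, not part of the datasets library):
# builds the config name for an MMLU subtask as used in this dataset.
def mmlu_config_name(subject: str, n_shot: int = 5) -> str:
    return f"harness_hendrycksTest_{subject}_{n_shot}"

# These names match the config_name entries in the YAML header above.
print(mmlu_config_name("world_religions"))   # harness_hendrycksTest_world_religions_5
print(mmlu_config_name("abstract_algebra"))  # harness_hendrycksTest_abstract_algebra_5
```

Any of these names can be passed as the second argument of `load_dataset` in place of `"harness_winogrande_5"` in the example above.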
## Latest results
These are the [latest results from run 2023-11-15T03:46:31.104311](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt2-xl_lima_public/blob/main/results_2023-11-15T03-46-31.104311.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2579848503192349,
"acc_stderr": 0.030758432385023834,
"acc_norm": 0.25961199994409145,
"acc_norm_stderr": 0.03153372055003476,
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023507,
"mc2": 0.3874325444900457,
"mc2_stderr": 0.014089660369122726,
"em": 0.002726510067114094,
"em_stderr": 0.0005340111700415908,
"f1": 0.04890100671140956,
"f1_stderr": 0.0013085576550093093
},
"harness|arc:challenge|25": {
"acc": 0.2645051194539249,
"acc_stderr": 0.012889272949313368,
"acc_norm": 0.31143344709897613,
"acc_norm_stderr": 0.013532472099850949
},
"harness|hellaswag|10": {
"acc": 0.39842660824536946,
"acc_stderr": 0.004885735963346903,
"acc_norm": 0.5128460466042621,
"acc_norm_stderr": 0.004988134303021793
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.03712537833614865,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.03712537833614865
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2236842105263158,
"acc_stderr": 0.033911609343436025,
"acc_norm": 0.2236842105263158,
"acc_norm_stderr": 0.033911609343436025
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.30566037735849055,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.30566037735849055,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.037455547914624576,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.037455547914624576
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3352601156069364,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.3352601156069364,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.13725490196078433,
"acc_stderr": 0.03424084669891522,
"acc_norm": 0.13725490196078433,
"acc_norm_stderr": 0.03424084669891522
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.251063829787234,
"acc_stderr": 0.028346963777162445,
"acc_norm": 0.251063829787234,
"acc_norm_stderr": 0.028346963777162445
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.03752833958003336,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.03752833958003336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20105820105820105,
"acc_stderr": 0.020641810782370165,
"acc_norm": 0.20105820105820105,
"acc_norm_stderr": 0.020641810782370165
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.041049472699033945,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.041049472699033945
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.20967741935483872,
"acc_stderr": 0.02315787934908352,
"acc_norm": 0.20967741935483872,
"acc_norm_stderr": 0.02315787934908352
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15763546798029557,
"acc_stderr": 0.025639014131172408,
"acc_norm": 0.15763546798029557,
"acc_norm_stderr": 0.025639014131172408
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3151515151515151,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.3151515151515151,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3383838383838384,
"acc_stderr": 0.03371124142626302,
"acc_norm": 0.3383838383838384,
"acc_norm_stderr": 0.03371124142626302
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.03027690994517825,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.03027690994517825
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3564102564102564,
"acc_stderr": 0.024283140529467295,
"acc_norm": 0.3564102564102564,
"acc_norm_stderr": 0.024283140529467295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655078,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655078
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.02720537153827948,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.02720537153827948
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3522935779816514,
"acc_stderr": 0.020480568843999,
"acc_norm": 0.3522935779816514,
"acc_norm_stderr": 0.020480568843999
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.20098039215686275,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.20098039215686275,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20675105485232068,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.20675105485232068,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.14349775784753363,
"acc_stderr": 0.0235293712696182,
"acc_norm": 0.14349775784753363,
"acc_norm_stderr": 0.0235293712696182
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.15702479338842976,
"acc_stderr": 0.0332124484254713,
"acc_norm": 0.15702479338842976,
"acc_norm_stderr": 0.0332124484254713
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.27607361963190186,
"acc_stderr": 0.0351238528370505,
"acc_norm": 0.27607361963190186,
"acc_norm_stderr": 0.0351238528370505
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404565,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404565
},
"harness|hendrycksTest-management|5": {
"acc": 0.27184466019417475,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.27184466019417475,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23931623931623933,
"acc_stderr": 0.02795182680892433,
"acc_norm": 0.23931623931623933,
"acc_norm_stderr": 0.02795182680892433
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26181353767560667,
"acc_stderr": 0.015720838678445256,
"acc_norm": 0.26181353767560667,
"acc_norm_stderr": 0.015720838678445256
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351294,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351294
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2057877813504823,
"acc_stderr": 0.022961339906764244,
"acc_norm": 0.2057877813504823,
"acc_norm_stderr": 0.022961339906764244
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25617283950617287,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.25617283950617287,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.02512373922687241,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.02512373922687241
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24119947848761408,
"acc_stderr": 0.010926496102034956,
"acc_norm": 0.24119947848761408,
"acc_norm_stderr": 0.010926496102034956
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19852941176470587,
"acc_stderr": 0.024231013370541107,
"acc_norm": 0.19852941176470587,
"acc_norm_stderr": 0.024231013370541107
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2434640522875817,
"acc_stderr": 0.017362473762146634,
"acc_norm": 0.2434640522875817,
"acc_norm_stderr": 0.017362473762146634
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2163265306122449,
"acc_stderr": 0.02635891633490403,
"acc_norm": 0.2163265306122449,
"acc_norm_stderr": 0.02635891633490403
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.03076944496729602,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.03076944496729602
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-virology|5": {
"acc": 0.26506024096385544,
"acc_stderr": 0.03436024037944967,
"acc_norm": 0.26506024096385544,
"acc_norm_stderr": 0.03436024037944967
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023507,
"mc2": 0.3874325444900457,
"mc2_stderr": 0.014089660369122726
},
"harness|winogrande|5": {
"acc": 0.5722178374112076,
"acc_stderr": 0.013905134013839943
},
"harness|drop|3": {
"em": 0.002726510067114094,
"em_stderr": 0.0005340111700415908,
"f1": 0.04890100671140956,
"f1_stderr": 0.0013085576550093093
},
"harness|gsm8k|5": {
"acc": 0.009097801364670205,
"acc_stderr": 0.002615326510775673
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7065489888191223,
-0.8551594018936157,
0.31492990255355835,
0.215389683842659,
-0.19571293890476227,
-0.030969329178333282,
0.01432394701987505,
-0.25616148114204407,
0.5590105056762695,
-0.049186065793037415,
-0.473120778799057,
-0.6868364214897156,
-0.4749840497970581,
0.2266511619091... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Lichang-Chen/Alpaca | Lichang-Chen | 2023-11-15T03:49:41Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T03:49:41Z | 2023-11-15T03:49:41.000Z | 2023-11-15T03:49:41 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Medradome/Analucie | Medradome | 2023-11-15T04:04:54Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-15T04:04:54Z | 2023-11-15T04:02:54.000Z | 2023-11-15T04:02:54 | ---
license: apache-2.0
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
chiyuanhsiao/NoiseDetection_VCTK_MUSAN-Noise | chiyuanhsiao | 2023-11-15T04:18:27Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T04:18:27Z | 2023-11-15T04:15:14.000Z | 2023-11-15T04:15:14 | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 3457037204.875
num_examples: 26865
download_size: 3392199913
dataset_size: 3457037204.875
---
# Dataset Card for "NoiseDetection_VCTK_MUSAN-Noise"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.5072267651557922,
-0.3314189910888672,
0.17426595091819763,
0.39827558398246765,
-0.35523250699043274,
-0.05057842284440994,
0.2721986472606659,
-0.09743024408817291,
0.6423389911651611,
0.5905976891517639,
-1.081534743309021,
-0.7974920272827148,
-0.5951445698738098,
-0.373663693666458... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
chiyuanhsiao/NoiseDetection_VCTK_MUSAN-Speech | chiyuanhsiao | 2023-11-15T04:44:25Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T04:44:25Z | 2023-11-15T04:41:03.000Z | 2023-11-15T04:41:03 | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 3457423150.875
num_examples: 26865
download_size: 3394048375
dataset_size: 3457423150.875
---
# Dataset Card for "NoiseDetection_VCTK_MUSAN-Speech"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.45492124557495117,
-0.42228055000305176,
-0.0015118122100830078,
0.3008537292480469,
-0.2815440595149994,
0.002330075018107891,
-0.00985647551715374,
-0.09493216872215271,
0.6177585124969482,
0.6030760407447815,
-1.1076884269714355,
-0.8120626211166382,
-0.6618102192878723,
-0.480045765... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
joshuasundance/govgis_nov2023 | joshuasundance | 2023-11-17T16:16:30Z | 0 | 4 | null | [
"size_categories:100K<n<1M",
"language:en",
"license:mit",
"gis",
"geospatial",
"doi:10.57967/hf/1368",
"region:us"
] | 2023-11-17T16:16:30Z | 2023-11-15T04:54:34.000Z | 2023-11-15T04:54:34 | ---
language:
- en
tags:
- gis
- geospatial
license: mit
size_categories:
- 100K<n<1M
---
# govgis_nov2023
🤖 This README was written by GPT-4. 🤖
`govgis_nov2023` is an extensive compilation of metadata, documenting geospatial data from known government servers as of November 15, 2023. It should provide a rich resource for GIS analysis, research, and application development.
These datasets contain data from various Federal, State, County, and City ArcGIS Servers listed by Joseph Elfelt of [Mapping Support](https://mappingsupport.com). It serves as a unique snapshot capturing the state of these servers in November 2023.
This repo contains the [very messy] notebooks with the code used to compile the data and save it in parquet format.
## Overview
- Content: Includes three primary files: servers.parquet, services.parquet, and layers.parquet, offering detailed insights into numerous GIS servers and layers.
- Size and Scope: The dataset covers data from 1684 servers, detailing almost a million individual layers with extensive metadata including field information for feature layers, cell size for raster layers, etc.
- Format: Data is stored in Parquet format, facilitating efficient storage and quick access.
- Status: This is a static snapshot and not actively maintained like Joseph Elfelt’s ongoing listings. However, this foundation may evolve into a maintained index.
## Data Collection
- Tools & Libraries Used: Data was collected using the [`restgdf`](https://github.com/joshuasundance-swca/restgdf) library, designed for efficient and asynchronous interaction with ArcGIS servers.
- Process: The dataset was created by scraping information from a wide range of ArcGIS servers, focusing on capturing a comprehensive and detailed snapshot as of November 2023.
- Verification: While data integrity was a focus, the dataset was not subjected to extensive cleaning, preserving the raw and detailed nature of the information.
## Data Processing
- Data Cleaning: Minimal cleaning was conducted to maintain the dataset's comprehensive and raw nature, allowing users to filter and process data as needed.
- Data Transformation: Collected data was standardized and converted into Parquet format for ease of use and accessibility.
## Use Cases
The `govgis_nov2023` dataset can be utilized for:
- Educational and Research Purposes: A valuable resource for GIS students, educators, and researchers.
- Geospatial Data Analysis: Ideal for analysts and data scientists for conducting extensive geospatial analyses.
- GIS Application Development: Useful for developers in building or enhancing GIS-related applications.
- Language Model Integration: The dataset can be used to train or evaluate language models for generating descriptions or summaries of GIS data.
## Conclusion
- Creation: This dataset was created using the restgdf library, emphasizing the potential of open-source contributions in the GIS field.
- Data Source: The dataset comprises data from publicly accessible ArcGIS servers. The dataset creator has no affiliation with Joseph Elfelt, MappingSupport.com, or the servers' respective owners.
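The three parquet files described above can be explored with pandas. The sketch below is illustrative only — the column names (`name`, `type`, `url`) are assumptions for demonstration, not the actual schema of `layers.parquet`:

```python
import pandas as pd

# Hypothetical miniature of layers.parquet -- the real file has many more
# columns; a "type" field distinguishing Feature vs Raster layers is an
# assumption made for this example.
layers = pd.DataFrame(
    {
        "name": ["parcels", "elevation", "roads"],
        "type": ["Feature Layer", "Raster Layer", "Feature Layer"],
        "url": [
            "https://example.gov/arcgis/rest/services/parcels/0",
            "https://example.gov/arcgis/rest/services/elevation/0",
            "https://example.gov/arcgis/rest/services/roads/1",
        ],
    }
)

# Keep only feature layers, e.g. to inspect their field metadata.
feature_layers = layers[layers["type"] == "Feature Layer"]
print(len(feature_layers))
```

In practice you would replace the toy DataFrame with `pd.read_parquet("layers.parquet")` after downloading the file.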
| [
-0.7008516788482666,
-0.6174903512001038,
0.49828040599823,
-0.05943799391388893,
-0.015028662979602814,
-0.1885104477405548,
-0.1350855678319931,
-0.37122485041618347,
0.2823873460292816,
0.9360508322715759,
-0.29501965641975403,
-0.8608736991882324,
-0.4114726781845093,
0.013274188153445... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
anonymous-ai4science/ProFunc-9K | anonymous-ai4science | 2023-11-15T05:08:44Z | 0 | 0 | null | [
"license:cc-by-nc-sa-4.0",
"region:us"
] | 2023-11-15T05:08:44Z | 2023-11-15T04:57:31.000Z | 2023-11-15T04:57:31 | ---
license: cc-by-nc-sa-4.0
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
chiyuanhsiao/NoiseSNRLevelPrediction_VCTK_MUSAN-Gaussian | chiyuanhsiao | 2023-11-15T05:09:30Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T05:09:30Z | 2023-11-15T05:06:47.000Z | 2023-11-15T05:06:47 | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 3458150877.875
num_examples: 26865
download_size: 3434724026
dataset_size: 3458150877.875
---
# Dataset Card for "NoiseSNRLevelPrediction_VCTK_MUSAN-Gaussian"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.4534004330635071,
-0.32970312237739563,
0.12075190246105194,
0.5780010223388672,
-0.2951986789703369,
-0.10756032168865204,
0.062099259346723557,
-0.033422667533159256,
0.6012808680534363,
0.34505003690719604,
-1.0583901405334473,
-0.9791831374168396,
-0.6760311126708984,
-0.42363181710... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Aharneish/sharded_data | Aharneish | 2023-11-15T05:29:50Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T05:29:50Z | 2023-11-15T05:29:50.000Z | 2023-11-15T05:29:50 | Entry not found | [
-0.32276472449302673,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965679168701,
0.7915717363357544,
0.07618629932403564,
0.7746022939682007,
0.2563222646713257,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
chiyuanhsiao/NoiseSNRLevelPrediction_VCTK_MUSAN-Music | chiyuanhsiao | 2023-11-15T05:35:08Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T05:35:08Z | 2023-11-15T05:31:51.000Z | 2023-11-15T05:31:51 | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 3457963730.875
num_examples: 26865
download_size: 3434234141
dataset_size: 3457963730.875
---
# Dataset Card for "NoiseSNRLevelPrediction_VCTK_MUSAN-Music"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.5013785362243652,
-0.08917583525180817,
-0.039425987750291824,
0.5944698452949524,
-0.2549002468585968,
-0.05024593323469162,
-0.027792036533355713,
-0.005359757225960493,
0.6324729919433594,
0.48618608713150024,
-1.1868205070495605,
-1.0405501127243042,
-0.6053799390792847,
-0.28173735... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
sgjwong/ltedi23-models | sgjwong | 2023-11-15T06:12:16Z | 0 | 0 | null | [
"license:cc-by-4.0",
"region:us"
] | 2023-11-15T06:12:16Z | 2023-11-15T05:39:23.000Z | 2023-11-15T05:39:23 | ---
license: cc-by-4.0
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Pablao0948/Jax | Pablao0948 | 2023-11-15T05:41:51Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-15T05:41:51Z | 2023-11-15T05:41:29.000Z | 2023-11-15T05:41:29 | ---
license: openrail
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
hanruijiang/civitai-stable-diffusion-2.5m | hanruijiang | 2023-11-15T06:23:47Z | 0 | 2 | null | [
"task_categories:text-generation",
"task_categories:text-to-image",
"size_categories:1M<n<10M",
"language:en",
"license:apache-2.0",
"art",
"region:us"
] | 2023-11-15T06:23:47Z | 2023-11-15T05:54:10.000Z | 2023-11-15T05:54:10 | ---
license: apache-2.0
task_categories:
- text-generation
- text-to-image
language:
- en
tags:
- art
size_categories:
- 1M<n<10M
---
Inspired by thefcraft/civitai-stable-diffusion-337k.
Collected using the Civitai API to get all prompts. | [
-0.5345818996429443,
-0.47399282455444336,
0.6823691129684448,
0.639182448387146,
-0.35273414850234985,
0.24757981300354004,
0.2368621826171875,
-0.030383490025997162,
0.8955281972885132,
0.2648472189903259,
-1.2728326320648193,
-0.21808913350105286,
-0.010421684943139553,
0.22065752744674... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
chiyuanhsiao/NoiseSNRLevelPrediction_VCTK_MUSAN-Noise | chiyuanhsiao | 2023-11-15T06:00:25Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T06:00:25Z | 2023-11-15T05:57:54.000Z | 2023-11-15T05:57:54 | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 3457873585.875
num_examples: 26865
download_size: 3426824990
dataset_size: 3457873585.875
---
# Dataset Card for "NoiseSNRLevelPrediction_VCTK_MUSAN-Noise"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.44090700149536133,
-0.24817310273647308,
-0.016020474955439568,
0.5501145124435425,
-0.3030048906803131,
-0.07057663798332214,
0.13341154158115387,
-0.07731714099645615,
0.5821605324745178,
0.5003489851951599,
-1.158733606338501,
-0.9705172181129456,
-0.5795650482177734,
-0.281882256269... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
kkk0001/gd | kkk0001 | 2023-11-15T06:08:49Z | 0 | 0 | null | [
"task_categories:text-classification",
"language:en",
"license:mit",
"music",
"chemistry",
"region:us"
] | 2023-11-15T06:08:49Z | 2023-11-15T05:58:49.000Z | 2023-11-15T05:58:49 | ---
license: mit
task_categories:
- text-classification
language:
- en
tags:
- music
- chemistry
---
good | [
-0.1280050277709961,
-0.6574068069458008,
0.41326382756233215,
0.11075013875961304,
-0.32426658272743225,
0.05087730288505554,
0.22882764041423798,
-0.2980290651321411,
0.9535309672355652,
0.9962277412414551,
0.025649527087807655,
-0.21536999940872192,
-0.706927478313446,
-0.17874996364116... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
YuhoLiang/CVPR | YuhoLiang | 2023-11-15T06:11:22Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T06:11:22Z | 2023-11-15T06:11:22.000Z | 2023-11-15T06:11:22 | Entry not found | [
-0.32276472449302673,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965679168701,
0.7915717363357544,
0.07618629932403564,
0.7746022939682007,
0.2563222646713257,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
alexostrovskyy/sbic | alexostrovskyy | 2023-11-15T06:26:01Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-15T06:26:01Z | 2023-11-15T06:23:41.000Z | 2023-11-15T06:23:41 | ---
license: mit
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
xDAN-datasets/Huatuo26M_GPTShine_178k | xDAN-datasets | 2023-11-15T06:45:17Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T06:45:17Z | 2023-11-15T06:45:07.000Z | 2023-11-15T06:45:07 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 242537016
num_examples: 177703
download_size: 148588806
dataset_size: 242537016
---
# Dataset Card for "Huatuo26M_GPTShine_178k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.512930154800415,
-0.08564603328704834,
0.236163929104805,
0.42118048667907715,
-0.6282079219818115,
-0.11808699369430542,
0.1682775765657425,
-0.022941814735531807,
0.5331783890724182,
0.6132034659385681,
-0.7127799987792969,
-0.6428011059761047,
-0.5983366966247559,
-0.4069941937923431... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
chiyuanhsiao/NoiseDetection_VCTK-MUSAN-Gaussian | chiyuanhsiao | 2023-11-15T06:52:03Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T06:52:03Z | 2023-11-15T06:48:45.000Z | 2023-11-15T06:48:45 | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 3457577090.875
num_examples: 26865
download_size: 3399676071
dataset_size: 3457577090.875
---
# Dataset Card for "NoiseDetection_VCTK-MUSAN-Gaussian"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.5293643474578857,
-0.43314817547798157,
0.29231488704681396,
0.4218285083770752,
-0.35260507464408875,
-0.08345688134431839,
0.21192318201065063,
-0.04885394498705864,
0.6434317827224731,
0.44997870922088623,
-1.0122313499450684,
-0.8106050491333008,
-0.6319280862808228,
-0.497990489006... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
karan23s/hf-codegen | karan23s | 2023-11-15T07:22:42Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T07:22:42Z | 2023-11-15T07:22:42.000Z | 2023-11-15T07:22:42 | Entry not found | [
-0.32276472449302673,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965679168701,
0.7915717363357544,
0.07618629932403564,
0.7746022939682007,
0.2563222646713257,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
guangyil/laion-coco-aesthetic | guangyil | 2023-11-15T10:34:11Z | 0 | 0 | null | [
"task_categories:image-to-text",
"task_categories:text-to-image",
"size_categories:1M<n<10M",
"language:en",
"license:apache-2.0",
"laion",
"region:us"
] | 2023-11-15T10:34:11Z | 2023-11-15T07:24:06.000Z | 2023-11-15T07:24:06 | ---
license: apache-2.0
task_categories:
- image-to-text
- text-to-image
language:
- en
tags:
- laion
pretty_name: s
size_categories:
- 1M<n<10M
---
# LAION COCO with aesthetic score and watermark score
This dataset contains a 10% sample of the [LAION-COCO](https://huggingface.co/datasets/laion/laion-coco) dataset, filtered by text rules (*remove URLs, special tokens*, etc.) and image rules (*image size > 384x384*, *aesthetic score > 4.75*, and *watermark probability < 0.5*). There are 8,563,753 data instances in total. The corresponding [aesthetic score](https://github.com/christophschuhmann/improved-aesthetic-predictor/tree/main) and [watermark score](https://github.com/LAION-AI/LAION-5B-WatermarkDetection/tree/main) are also included.
Note: the watermark score in the table is the probability that a watermark is present in the image.
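The filtering thresholds above can be applied to tabular data like so. This is a minimal sketch on toy rows — the column names `aesthetic_score` and `watermark_score` are assumptions for illustration, not necessarily the dataset's actual field names:

```python
import pandas as pd

# Toy rows standing in for dataset records; column names are assumed.
df = pd.DataFrame(
    {
        "caption": ["a cat", "a logo", "a beach"],
        "aesthetic_score": [5.2, 6.1, 4.1],
        "watermark_score": [0.1, 0.9, 0.2],
    }
)

# Same thresholds as described in the card: keep high-aesthetic,
# low-watermark-probability samples.
kept = df[(df["aesthetic_score"] > 4.75) & (df["watermark_score"] < 0.5)]
print(list(kept["caption"]))
```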
-0.5213411450386047,
-0.725775420665741,
0.39175888895988464,
0.32577231526374817,
-0.6276599764823914,
-0.02748810686171055,
0.11817587912082672,
-0.6459806561470032,
0.4481176435947418,
0.699235737323761,
-0.573643684387207,
-0.9117605090141296,
-0.44768399000167847,
-0.0642375499010086,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
xDAN-datasets/huatuo_encyclopedia_qa_364k | xDAN-datasets | 2023-11-15T07:28:59Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T07:28:59Z | 2023-11-15T07:28:12.000Z | 2023-11-15T07:28:12 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1196521698
num_examples: 364420
download_size: 0
dataset_size: 1196521698
---
# Dataset Card for "huatuo_encyclopedia_qa_364k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.5926772952079773,
-0.011407307349145412,
0.26215964555740356,
0.23051439225673676,
-0.508966326713562,
-0.12176342308521271,
0.35488367080688477,
-0.13817203044891357,
0.6681215763092041,
0.5126386880874634,
-0.6618771553039551,
-0.6859652996063232,
-0.14895914494991302,
-0.072555214166... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
cloutwhore/TeaLeaves | cloutwhore | 2023-11-15T07:31:56Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T07:31:56Z | 2023-11-15T07:31:56.000Z | 2023-11-15T07:31:56 | Entry not found | [
-0.32276487350463867,
-0.22568444907665253,
0.8622263073921204,
0.43461570143699646,
-0.5282988548278809,
0.7012969255447388,
0.7915717363357544,
0.07618642598390579,
0.7746027112007141,
0.25632190704345703,
-0.7852815389633179,
-0.22573848068714142,
-0.910447895526886,
0.5715675354003906,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
qblocks/no_robots_chatformatted_version1 | qblocks | 2023-11-15T07:47:42Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T07:47:42Z | 2023-11-15T07:47:41.000Z | 2023-11-15T07:47:41 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: category
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 29092450
num_examples: 9500
- name: test
num_bytes: 1560738
num_examples: 500
download_size: 18917122
dataset_size: 30653188
---
# Dataset Card for "no_robots_FalconChatFormated"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.614260733127594,
-0.31512507796287537,
0.058385513722896576,
0.08068373799324036,
-0.08163134753704071,
0.2605799734592438,
0.5690496563911438,
0.11357999593019485,
1.1977423429489136,
0.840144693851471,
-1.1540122032165527,
-0.8103565573692322,
-0.6050867438316345,
0.010288560763001442... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Zangs3011/no_robots_test | Zangs3011 | 2023-11-15T07:49:36Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T07:49:36Z | 2023-11-15T07:49:11.000Z | 2023-11-15T07:49:11 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: category
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 29092450
num_examples: 9500
- name: test
num_bytes: 1560738
num_examples: 500
download_size: 18917122
dataset_size: 30653188
---
# Dataset Card for "no_robots_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.6846439242362976,
-0.40003371238708496,
0.18084612488746643,
0.05718699470162392,
-0.07124125212430954,
-0.07169865816831589,
0.3767794072628021,
0.04390238597989082,
0.936781108379364,
0.6737331748008728,
-1.1111503839492798,
-0.8185978531837463,
-0.4268395006656647,
0.0657590553164482... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Zangs3011/no_robots_gpt2ChatFormated | Zangs3011 | 2023-11-15T07:50:01Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T07:50:01Z | 2023-11-15T07:49:59.000Z | 2023-11-15T07:49:59 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: category
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 29092450
num_examples: 9500
- name: test
num_bytes: 1560738
num_examples: 500
download_size: 18917122
dataset_size: 30653188
---
# Dataset Card for "no_robots_gpt2ChatFormated"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.3594464361667633,
-0.5036324858665466,
0.17528702318668365,
0.1627229005098343,
-0.22842098772525787,
0.11297106742858887,
0.3941490650177002,
-0.01994989439845085,
0.8387272357940674,
0.7624095678329468,
-0.9668995141983032,
-0.744918167591095,
-0.6461135745048523,
-0.15892784297466278... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
surabhiMV/qrcode_new_train | surabhiMV | 2023-11-15T08:08:38Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T08:08:38Z | 2023-11-15T08:08:34.000Z | 2023-11-15T08:08:34 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 13629030.0
num_examples: 352
download_size: 12896919
dataset_size: 13629030.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "qrcode_new_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.6407684683799744,
0.1390610933303833,
0.20248734951019287,
0.21487419307231903,
-0.36384183168411255,
0.0922967940568924,
0.30432236194610596,
0.04507320374250412,
0.7452042698860168,
0.3853291869163513,
-0.837995171546936,
-0.7798627614974976,
-0.49035167694091797,
-0.3764641582965851,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
xDAN-datasets/glaive_code_assistant_140K | xDAN-datasets | 2023-11-15T08:11:26Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T08:11:26Z | 2023-11-15T08:11:00.000Z | 2023-11-15T08:11:00 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 417459108
num_examples: 136109
download_size: 0
dataset_size: 417459108
---
# Dataset Card for "glaive_code_assistant_140K"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.20076194405555725,
-0.20446768403053284,
0.12057133764028549,
0.3649401068687439,
-0.16862037777900696,
-0.03867028281092644,
0.2855283319950104,
-0.1623438596725464,
0.7135393023490906,
0.5618712306022644,
-0.667271614074707,
-0.8302595019340515,
-0.6988054513931274,
-0.348321884870529... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_Undi95__Llama2-13B-no_robots-alpaca-lora_public | open-llm-leaderboard | 2023-11-15T08:18:56Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T08:18:56Z | 2023-11-15T08:18:10.000Z | 2023-11-15T08:18:10 | ---
pretty_name: Evaluation run of Undi95/Llama2-13B-no_robots-alpaca-lora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/Llama2-13B-no_robots-alpaca-lora](https://huggingface.co/Undi95/Llama2-13B-no_robots-alpaca-lora)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__Llama2-13B-no_robots-alpaca-lora_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-15T08:15:04.836039](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Llama2-13B-no_robots-alpaca-lora_public/blob/main/results_2023-11-15T08-15-04.836039.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5288556443369928,\n\
\ \"acc_stderr\": 0.03390383953418472,\n \"acc_norm\": 0.5370018287535696,\n\
\ \"acc_norm_stderr\": 0.034712721572579625,\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.4045559753787184,\n\
\ \"mc2_stderr\": 0.01423646056016957,\n \"em\": 0.031774328859060404,\n\
\ \"em_stderr\": 0.0017962473521312278,\n \"f1\": 0.09261220637583845,\n\
\ \"f1_stderr\": 0.0021550523797604715\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5418088737201365,\n \"acc_stderr\": 0.014560220308714695,\n\
\ \"acc_norm\": 0.5887372013651877,\n \"acc_norm_stderr\": 0.014379441068522082\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6309500099581756,\n\
\ \"acc_stderr\": 0.004815613144385403,\n \"acc_norm\": 0.8243377813184625,\n\
\ \"acc_norm_stderr\": 0.003797548252851636\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5660377358490566,\n \"acc_stderr\": 0.030503292013342592,\n\
\ \"acc_norm\": 0.5660377358490566,\n \"acc_norm_stderr\": 0.030503292013342592\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.49710982658959535,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.49710982658959535,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374768,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374768\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3492063492063492,\n \"acc_stderr\": 0.024552292209342654,\n \"\
acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.024552292209342654\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.039325376803928704,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.039325376803928704\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6258064516129033,\n\
\ \"acc_stderr\": 0.027528904299845704,\n \"acc_norm\": 0.6258064516129033,\n\
\ \"acc_norm_stderr\": 0.027528904299845704\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.037818873532059816,\n\
\ \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.037818873532059816\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6464646464646465,\n \"acc_stderr\": 0.03406086723547155,\n \"\
acc_norm\": 0.6464646464646465,\n \"acc_norm_stderr\": 0.03406086723547155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845454,\n\
\ \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845454\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5205128205128206,\n \"acc_stderr\": 0.02532966316348994,\n \
\ \"acc_norm\": 0.5205128205128206,\n \"acc_norm_stderr\": 0.02532966316348994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.032252942323996406,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.032252942323996406\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6844036697247706,\n \"acc_stderr\": 0.019926117513869666,\n \"\
acc_norm\": 0.6844036697247706,\n \"acc_norm_stderr\": 0.019926117513869666\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896079,\n \"\
acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896079\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7205882352941176,\n \"acc_stderr\": 0.031493281045079556,\n \"\
acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.031493281045079556\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.041032038305145124,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.041032038305145124\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.0471282125742677,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.0471282125742677\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404544,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404544\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7350427350427351,\n\
\ \"acc_stderr\": 0.028911208802749486,\n \"acc_norm\": 0.7350427350427351,\n\
\ \"acc_norm_stderr\": 0.028911208802749486\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7266922094508301,\n\
\ \"acc_stderr\": 0.015936681062628556,\n \"acc_norm\": 0.7266922094508301,\n\
\ \"acc_norm_stderr\": 0.015936681062628556\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.615606936416185,\n \"acc_stderr\": 0.026189666966272035,\n\
\ \"acc_norm\": 0.615606936416185,\n \"acc_norm_stderr\": 0.026189666966272035\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26927374301675977,\n\
\ \"acc_stderr\": 0.014835616582882611,\n \"acc_norm\": 0.26927374301675977,\n\
\ \"acc_norm_stderr\": 0.014835616582882611\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.02845263998508801,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.02845263998508801\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\
\ \"acc_stderr\": 0.027648149599751468,\n \"acc_norm\": 0.6141479099678456,\n\
\ \"acc_norm_stderr\": 0.027648149599751468\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6172839506172839,\n \"acc_stderr\": 0.02704453813840261,\n\
\ \"acc_norm\": 0.6172839506172839,\n \"acc_norm_stderr\": 0.02704453813840261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41843971631205673,\n \"acc_stderr\": 0.02942799403941999,\n \
\ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.02942799403941999\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41134289439374183,\n\
\ \"acc_stderr\": 0.012567882673803685,\n \"acc_norm\": 0.41134289439374183,\n\
\ \"acc_norm_stderr\": 0.012567882673803685\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4338235294117647,\n \"acc_stderr\": 0.030105636570016633,\n\
\ \"acc_norm\": 0.4338235294117647,\n \"acc_norm_stderr\": 0.030105636570016633\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5294117647058824,\n \"acc_stderr\": 0.02019280827143379,\n \
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.02019280827143379\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.031130880396235933,\n\
\ \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.031130880396235933\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.030965903123573033,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.030965903123573033\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.4045559753787184,\n\
\ \"mc2_stderr\": 0.01423646056016957\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7529597474348856,\n \"acc_stderr\": 0.012121402942855575\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.031774328859060404,\n \
\ \"em_stderr\": 0.0017962473521312278,\n \"f1\": 0.09261220637583845,\n\
\ \"f1_stderr\": 0.0021550523797604715\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.06444275966641395,\n \"acc_stderr\": 0.006763391728488274\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Undi95/Llama2-13B-no_robots-alpaca-lora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|arc:challenge|25_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|drop|3_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|gsm8k|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hellaswag|10_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T08-15-04.836039.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-15T08-15-04.836039.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- '**/details_harness|winogrande|5_2023-11-15T08-15-04.836039.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-15T08-15-04.836039.parquet'
- config_name: results
data_files:
- split: 2023_11_15T08_15_04.836039
path:
- results_2023-11-15T08-15-04.836039.parquet
- split: latest
path:
- results_2023-11-15T08-15-04.836039.parquet
---
# Dataset Card for Evaluation run of Undi95/Llama2-13B-no_robots-alpaca-lora
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/Llama2-13B-no_robots-alpaca-lora
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/Llama2-13B-no_robots-alpaca-lora](https://huggingface.co/Undi95/Llama2-13B-no_robots-alpaca-lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__Llama2-13B-no_robots-alpaca-lora_public",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-11-15T08:15:04.836039](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Llama2-13B-no_robots-alpaca-lora_public/blob/main/results_2023-11-15T08-15-04.836039.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each task's configuration):
```python
{
"all": {
"acc": 0.5288556443369928,
"acc_stderr": 0.03390383953418472,
"acc_norm": 0.5370018287535696,
"acc_norm_stderr": 0.034712721572579625,
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.4045559753787184,
"mc2_stderr": 0.01423646056016957,
"em": 0.031774328859060404,
"em_stderr": 0.0017962473521312278,
"f1": 0.09261220637583845,
"f1_stderr": 0.0021550523797604715
},
"harness|arc:challenge|25": {
"acc": 0.5418088737201365,
"acc_stderr": 0.014560220308714695,
"acc_norm": 0.5887372013651877,
"acc_norm_stderr": 0.014379441068522082
},
"harness|hellaswag|10": {
"acc": 0.6309500099581756,
"acc_stderr": 0.004815613144385403,
"acc_norm": 0.8243377813184625,
"acc_norm_stderr": 0.003797548252851636
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5660377358490566,
"acc_stderr": 0.030503292013342592,
"acc_norm": 0.5660377358490566,
"acc_norm_stderr": 0.030503292013342592
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374768,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374768
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.024552292209342654,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.024552292209342654
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.039325376803928704,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.039325376803928704
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6258064516129033,
"acc_stderr": 0.027528904299845704,
"acc_norm": 0.6258064516129033,
"acc_norm_stderr": 0.027528904299845704
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6242424242424243,
"acc_stderr": 0.037818873532059816,
"acc_norm": 0.6242424242424243,
"acc_norm_stderr": 0.037818873532059816
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6464646464646465,
"acc_stderr": 0.03406086723547155,
"acc_norm": 0.6464646464646465,
"acc_norm_stderr": 0.03406086723547155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845454,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845454
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5205128205128206,
"acc_stderr": 0.02532966316348994,
"acc_norm": 0.5205128205128206,
"acc_norm_stderr": 0.02532966316348994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.032252942323996406,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.032252942323996406
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6844036697247706,
"acc_stderr": 0.019926117513869666,
"acc_norm": 0.6844036697247706,
"acc_norm_stderr": 0.019926117513869666
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896079,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896079
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.041032038305145124,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.041032038305145124
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0471282125742677,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0471282125742677
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404544,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404544
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7350427350427351,
"acc_stderr": 0.028911208802749486,
"acc_norm": 0.7350427350427351,
"acc_norm_stderr": 0.028911208802749486
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7266922094508301,
"acc_stderr": 0.015936681062628556,
"acc_norm": 0.7266922094508301,
"acc_norm_stderr": 0.015936681062628556
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.615606936416185,
"acc_stderr": 0.026189666966272035,
"acc_norm": 0.615606936416185,
"acc_norm_stderr": 0.026189666966272035
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26927374301675977,
"acc_stderr": 0.014835616582882611,
"acc_norm": 0.26927374301675977,
"acc_norm_stderr": 0.014835616582882611
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.02845263998508801,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.02845263998508801
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751468,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751468
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6172839506172839,
"acc_stderr": 0.02704453813840261,
"acc_norm": 0.6172839506172839,
"acc_norm_stderr": 0.02704453813840261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.02942799403941999,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.02942799403941999
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41134289439374183,
"acc_stderr": 0.012567882673803685,
"acc_norm": 0.41134289439374183,
"acc_norm_stderr": 0.012567882673803685
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4338235294117647,
"acc_stderr": 0.030105636570016633,
"acc_norm": 0.4338235294117647,
"acc_norm_stderr": 0.030105636570016633
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.02019280827143379,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.02019280827143379
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6163265306122448,
"acc_stderr": 0.031130880396235933,
"acc_norm": 0.6163265306122448,
"acc_norm_stderr": 0.031130880396235933
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.030965903123573033,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.030965903123573033
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.4045559753787184,
"mc2_stderr": 0.01423646056016957
},
"harness|winogrande|5": {
"acc": 0.7529597474348856,
"acc_stderr": 0.012121402942855575
},
"harness|drop|3": {
"em": 0.031774328859060404,
"em_stderr": 0.0017962473521312278,
"f1": 0.09261220637583845,
"f1_stderr": 0.0021550523797604715
},
"harness|gsm8k|5": {
"acc": 0.06444275966641395,
"acc_stderr": 0.006763391728488274
}
}
```
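As an illustration, the per-task MMLU ("hendrycksTest") accuracies in a results dict shaped like the JSON above can be macro-averaged in a few lines of Python. This is a sketch only: `sample_results` and `mmlu_macro_avg` are hypothetical names introduced here, and the sample dict is seeded with a small subset of the values shown above rather than the full run.

```python
# Sketch: macro-average the "acc" metric over the MMLU
# ("hendrycksTest") subtasks of a results dict shaped like the
# JSON above. Only a small illustrative subset is included here.
sample_results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4888888888888889},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.5328947368421053},
    "harness|truthfulqa:mc|0": {"mc1": 0.28151774785801714},  # ignored: not MMLU
}

def mmlu_macro_avg(results: dict) -> float:
    """Unweighted mean of 'acc' across all hendrycksTest subtasks."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

print(round(mmlu_macro_avg(sample_results), 4))  # → 0.4539
```

On the full 57-subtask dict this reproduces the kind of aggregate the leaderboard reports; here it averages only the three sampled subtasks.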
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.723548948764801,
-0.8562325835227966,
0.28353866934776306,
0.22426776587963104,
-0.20197822153568268,
-0.046694282442331314,
0.022215085104107857,
-0.24475011229515076,
0.6084713935852051,
-0.002193424617871642,
-0.49952852725982666,
-0.7016879320144653,
-0.44075247645378113,
0.25068914... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
AudioDecBenchmark/esc50 | AudioDecBenchmark | 2023-11-15T09:53:06Z | 0 | 0 | null | [
"license:cc-by-nc-4.0",
"region:us"
] | 2023-11-15T09:53:06Z | 2023-11-15T08:19:52.000Z | 2023-11-15T08:19:52 | ---
license: cc-by-nc-4.0
configs:
- config_name: default
data_files:
- split: original
path: data/original-*
- split: truncated
path: data/truncated-*
dataset_info:
features:
- name: id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: original
num_bytes: 882129224.0
num_examples: 2000
- name: truncated
num_bytes: 290015142.0
num_examples: 2000
download_size: 282072258
dataset_size: 1172144366.0
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
stiffmeister923/Building_Computer_Guide | stiffmeister923 | 2023-11-17T20:42:03Z | 0 | 0 | null | [
"license:ecl-2.0",
"region:us"
] | 2023-11-17T20:42:03Z | 2023-11-15T08:51:37.000Z | 2023-11-15T08:51:37 | ---
license: ecl-2.0
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
lllchenlll/COCO_ARC | lllchenlll | 2023-11-17T02:10:10Z | 0 | 1 | null | [
"arxiv:2311.08172",
"region:us"
] | 2023-11-17T02:10:10Z | 2023-11-15T09:41:23.000Z | 2023-11-15T09:41:23 | # Vision-Language Instruction Tuning: A Review and Analysis
---
**Chen Li<sup>1</sup>, Yixiao Ge<sup>1</sup>, Dian Li<sup>2</sup>, and Ying Shan<sup>1</sup>.**
**<sup>1</sup>ARC Lab, Tencent PCG**<br>
**<sup>2</sup>Foundation Technology Center, Tencent PCG**
<p align="center">
<img src="https://i.imgur.com/waxVImv.png" alt="Oryx Video-ChatGPT">
</p>
<a href='https://huggingface.co/datasets/lllchenlll/COCO_ARC'><img src='https://img.shields.io/badge/Data-Huggingface-ebc634'></a>
<a href='https://creativecommons.org/licenses/by/4.0/'><img src='https://img.shields.io/badge/License-CC%20BY--SA%204.0-eb9334'></a>
<a href='https://arxiv.org/abs/2311.08172'><img src='https://img.shields.io/badge/Paper-ArXiv-eb4c34'></a>
This paper is a review of work related to vision-language instruction tuning (VLIT). We will periodically update this list with recently published public VLIT datasets and with the VLIT data constructed by the pipeline in this paper.
---
## 📆 Schedule
- [ ] Release New Vision-Language Instruction Data (periodically) ...
- [ ] Update Public VLIT Datasets and Related Work (periodically) ...
- [ ] Release Construction Tools
- [x] [2023.11.16] Release Instruction Data
- [x] [2023.11.15] Paper Released ([ArXiv](https://arxiv.org/abs/2311.08172))
## 🏷️ Catalogue
1. <a href="#label_evd">Existing VLIT Data</a>
2. <a href="#label_vdctp">VLIT Data Constructed in This Paper</a>
<span id="label_evd"> </span>
## 🗒️ Existing VLIT Dataset
Currently, existing VLIT generation schemes fall into two categories. Annotation Adaption mainly relies on directly adjusting and rewriting existing annotation data to fit the VLIT data template. Self-Instruct relies on a Large Language Model (LLM) to synthesize annotation data from more sources and reorganize it, generating VLIT data with more diversity and complexity (at the cost of more noise and hallucination).
```
VLIT Data
├─ General Instruction
│ ├─ Annotation Adaption
│ └─ Self-Instruct
├─ Specific Instruction
│ ├─ Object/Task-Specific
│ │ ├─ Region
│ │ ├─ Video
│ │ └─ Text
│ └─ Domain-Specific
│ ├─ Medicine
│ ├─ Document
│ └─ PointCloud
├─ Construction Tools
└─ Data Mixing
```
### Dataset
| Dataset | MLLM | Paper |
| :--- | :--- | :---|
| ... | ... | ... |
| LVIS-INSTRUCT4V | - | [To See is to Believe: Prompting GPT-4V for Better Visual Instruction Tuning](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/To%20See%20is%20to%20Believe-%20Prompting%20GPT-4V%20for%20Better%20Visual%20Instruction%20Tuning.pdf) |
| [GranD](https://github.com/mbzuai-oryx/groundingLMM#-grounding-anything-dataset-grand) | [GLaMM](https://github.com/mbzuai-oryx/groundingLMM) | [GLaMM: Pixel Grounding Large Multimodal Model](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/GLaMM%20-%20Pixel%20Grounding%20Large%20Multimodal%20Model.pdf) |
| ComVint | - | [What Makes for Good Visual Instructions? Synthesizing Complex Visual Reasoning Instructions for Visual Instruction Tuning](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/What%20Makes%20for%20Good%20Visual%20Instructions%3F%20Synthesizing%20Complex%20Visual%20Reasoning%20Instructions%20for%20Visual%20Instruction%20Tuning.pdf) |
| [MiniGPT-v2](https://github.com/Vision-CAIR/MiniGPT-4/blob/main/MiniGPTv2_Train.md) | [MiniGPT-v2](https://github.com/Vision-CAIR/MiniGPT-4/tree/main) | [MiniGPT-v2: Large Language Model As a Unified Interface for Vision-Language Multi-task Learning](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/MiniGPT-v2-%20Large%20Language%20Model%20As%20a%20Unified%20Interface%20for%20Vision-Language%20Multi-task%20Learning.pdf) |
| GRIT | [Ferret](https://github.com/apple/ml-ferret) | [FERRET REFER AND GROUND ANYTHING ANYWHERE AT ANY GRANULARITY](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/FERRET%20REFER%20AND%20GROUND%20ANYTHING%20ANYWHERE%20AT%20ANY%20GRANULARITY.pdf) |
| [SparklesDialogue-VG](https://github.com/HYPJUDY/Sparkles#data-sparklesdialogue) | [SparklesChat](https://github.com/HYPJUDY/Sparkles) | [Sparkles: Unlocking Chats Across Multiple Images for Multimodal Instruction-Following Models](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/Sparkles-%20Unlocking%20Chats%20Across%20Multiple%20Images%20for%20Multimodal%20Instruction-Following%20Models.pdf) |
| [SparklesDialogue-CC](https://github.com/HYPJUDY/Sparkles#data-sparklesdialogue) | [SparklesChat](https://github.com/HYPJUDY/Sparkles) | [Sparkles: Unlocking Chats Across Multiple Images for Multimodal Instruction-Following Models](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/Sparkles-%20Unlocking%20Chats%20Across%20Multiple%20Images%20for%20Multimodal%20Instruction-Following%20Models.pdf) |
| InternLM-XComposer | [InternLM-XComposer](https://github.com/InternLM/InternLM-XComposer) | [InternLM-XComposer: A Vision-Language Large Model for Advanced Text-image Comprehension and Composition](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/InternLM-XComposer-%20A%20Vision-Language%20Large%20Model%20for%20Advanced%20Text-image%20Comprehension%20and%20Composition.pdf) |
| AnyMAL | AnyMAL | [AnyMAL: An Efficient and Scalable Any-Modality Augmented Language Model](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/AnyMAL-%20An%20Efficient%20and%20Scalable%20Any-Modality%20Augmented%20Language%20Model.pdf) |
| DreamLLM | [DreamLLM](https://github.com/RunpeiDong/DreamLLM) | [DREAMLLM: SYNERGISTIC MULTIMODAL COMPREHENSION AND CREATION](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/DREAMLLM-%20SYNERGISTIC%20MULTIMODAL%20COMPREHENSION%20AND%20CREATION.pdf) |
| [TextBind](https://github.com/SihengLi99/TextBind#31-data-preparation) | [TextBind](https://github.com/SihengLi99/TextBind) | [TEXTBIND: Multi-turn Interleaved Multimodal Instruction-following in the Wild](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/TEXTBIND-%20Multi-turn%20Interleaved%20Multimodal%20Instruction-following%20in%20the%20Wild.pdf) |
| [PVIT](https://huggingface.co/PVIT) | [PVIT](https://github.com/PVIT-official/PVIT) | [Position-Enhanced Visual Instruction Tuning for Multimodal Large Language Models](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/Position-Enhanced%20Visual%20Instruction%20Tuning%20for%20Multimodal%20Large%20Language%20Models.pdf) |
| T2M | [NExT-GPT](https://github.com/NExT-GPT/NExT-GPT) | [NExT-GPT: Any-to-Any Multimodal LLM](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/NExT-GPT-%20Any-to-Any%20Multimodal%20LLM.pdf) |
| MosIT | [NExT-GPT](https://github.com/NExT-GPT/NExT-GPT) | [NExT-GPT: Any-to-Any Multimodal LLM](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/NExT-GPT-%20Any-to-Any%20Multimodal%20LLM.pdf) |
| [GPTVQA](https://opendatalab.com/OpenDataLab/DataEngine-InstData) | [MLLM-DataEngine](https://github.com/opendatalab/MLLM-DataEngine) | [MLLM-DataEngine: An Iterative Refinement Approach for MLLM](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/MLLM-DataEngine-%20An%20Iterative%20Refinement%20Approach%20for%20MLLM.pdf) |
| CIEM | - | [CIEM: Contrastive Instruction Evaluation Method for Better Instruction Tuning](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/CIEM-%20Contrastive%20Instruction%20Evaluation%20Method%20for%20Better%20Instruction%20Tuning.pdf) |
| [PointLLM](https://huggingface.co/datasets/RunsenXu/PointLLM/tree/main) | [PointLLM](https://github.com/OpenRobotLab/PointLLM) | [PointLLM: Empowering Large Language Models to Understand Point Clouds](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/PointLLM-%20Empowering%20Large%20Language%20Models%20to%20Understand%20Point%20Clouds.pdf) |
| [VIGC](https://opendatalab.com/OpenDataLab/VIGC-InstData) | [VIGC](https://github.com/opendatalab/VIGC) | [VIGC: Visual Instruction Generation and Correction](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/VIGC-%20Visual%20Instruction%20Generation%20and%20Correction.pdf) |
| M-HalDetec | - | [Detecting and Preventing Hallucinations in Large Vision Language Models](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/Detecting%20and%20Preventing%20Hallucinations%20in%20Large%20Vision%20Language%20Models.pdf) |
| [StableLLaVA](https://github.com/icoz69/StableLLAVA#pipeline) | [StableLLaVA](https://github.com/icoz69/StableLLAVA) | [StableLLaVA: Enhanced Visual Instruction Tuning with Synthesized Image-Dialogue Data](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/StableLLaVA-%20Enhanced%20Visual%20Instruction%20Tuning%20with%20Synthesized%20Image-Dialogue%20Data.pdf) |
| [I4](https://github.com/DCDmllm/Cheetah/tree/main/I4%20Benchmark) | [Cheetor](https://github.com/DCDmllm/Cheetah) | [EMPOWERING VISION-LANGUAGE MODELS TO FOLLOW INTERLEAVED VISION-LANGUAGE INSTRUCTIONS](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/EMPOWERING%20VISION-LANGUAGE%20MODELS%20TO%20FOLLOW%20INTERLEAVED%20VISION-LANGUAGE%20INSTRUCTIONS.pdf) |
| [AS-1B](https://huggingface.co/spaces/OpenGVLab/all-seeing) | [ASM](https://github.com/OpenGVLab/All-Seeing) | [The All-Seeing Project: Towards Panoptic Visual Recognition and Understanding of the Open World](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/The%20All-Seeing%20Project-%20Towards%20Panoptic%20Visual%20Recognition%20and%20Understanding%20of%20the%20Open%20World.pdf) |
| [Multimodal_id_v1](https://huggingface.co/datasets/YunxinLi/Multimodal_Instruction_data_v1) | [LMEye(IPN)](https://github.com/YunxinLi/LingCloud) | [LMEye: An Interactive Perception Network for Large Language Models](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/LMEye-%20An%20Interactive%20Perception%20Network%20for%20Large%20Language%20Models.pdf) |
| [Lynx](https://github.com/bytedance/lynx-llm#prepare-data) | [Lynx](https://github.com/bytedance/lynx-llm) | [What Matters in Training a GPT4-Style Language Model with Multimodal Inputs?](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/What%20Matters%20in%20Training%20a%20GPT4-Style%20Language%20Model%20with%20Multimodal%20Inputs%3F.pdf) |
| MGVLID | ChatSpot | [ChatSpot: Bootstrapping Multimodal LLMs via Precise Referring Instruction Tuning](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/ChatSpot-%20Bootstrapping%20Multimodal%20LLMs%20via%20Precise%20Referring%20Instruction%20Tuning.pdf) |
| [BuboGPT](https://huggingface.co/datasets/magicr/BuboGPT/tree/main) | [BuboGPT](https://github.com/magic-research/bubogpt) | [BuboGPT: Enabling Visual Grounding in Multi-Modal LLMs](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/BuboGPT-%20Enabling%20Visual%20Grounding%20in%20Multi-Modal%20LLMs.pdf) |
| [GRIT-20M](https://huggingface.co/datasets/zzliang/GRIT) | [KOSMOS-2](https://github.com/microsoft/unilm/tree/master/kosmos-2) | [KOSMOS-2: Grounding Multimodal Large Language Models to the World](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/KOSMOS-2-%20Grounding%20Multimodal%20Large%20Language%20Models%20to%20the%20World.pdf) |
| [SVIT](https://huggingface.co/datasets/BAAI/SVIT) | [SVIT(MMLLM)](https://github.com/BAAI-DCAI/Visual-Instruction-Tuning) | [SVIT: Scaling up Visual Instruction Tuning](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/SVIT-%20Scaling%20up%20Visual%20Instruction%20Tuning.pdf) |
| [GPT4RoI](https://github.com/jshilong/GPT4RoI#data) | [GPT4RoI](https://github.com/jshilong/GPT4RoI) | [GPT4RoI: Instruction Tuning Large Language Model on Region-of-Interest](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/GPT4RoI-%20Instruction%20Tuning%20Large%20Language%20Model%20on%20Region-of-Interest.pdf) |
| [PF-1M](https://huggingface.co/datasets/chendelong/PF-1M) | [Clever Flamingo](https://github.com/ChenDelong1999/polite-flamingo) | [Visual Instruction Tuning with Polite Flamingo](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/Visual%20Instruction%20Tuning%20with%20Polite%20Flamingo.pdf) |
| [Shikra-RD](https://github.com/shikras/shikra/blob/main/docs/data.md) | [Shikra](https://github.com/shikras/shikra) | [Shikra: Unleashing Multimodal LLM’s Referential Dialogue Magic](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/Shikra-%20Unleashing%20Multimodal%20LLM%E2%80%99s%20Referential%20Dialogue%20Magic.pdf) |
| [LLaVAR](https://huggingface.co/datasets/SALT-NLP/LLaVAR) | [LLaVAR](https://github.com/SALT-NLP/LLaVAR) | [LLaVAR: Enhanced Visual Instruction Tuning for Text-Rich Image Understanding](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/LLaVAR-%20Enhanced%20Visual%20Instruction%20Tuning%20for%20Text-Rich%20Image%20Understanding.pdf) |
| OphGLM | [OphGLM](https://github.com/ML-AILab/OphGLM) | [OphGLM: Training an Ophthalmology Large Language-and-Vision Assistant based on Instructions and Dialogue](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/OphGLM-%20Training%20an%20Ophthalmology%20Large%20Language-and-Vision%20Assistant%20based%20on%20Instructions%20and%20Dialogue.pdf) |
| [LAMM](https://opendatalab.com/LAMM/download) | [LAMM](https://github.com/OpenGVLab/LAMM) | [LAMM: Language-Assisted Multi-Modal Instruction-Tuning Dataset, Framework, and Benchmark](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/LAMM-%20Language-Assisted%20Multi-Modal%20Instruction-Tuning%20Dataset%2C%20Framework%2C%20and%20Benchmark.pdf) |
| [MACAW-LLM](https://github.com/lyuchenyang/Macaw-LLM#usage-) | [MACAW-LLM](https://github.com/lyuchenyang/Macaw-LLM) | [Macaw-LLM: Multi-Modal Language Modeling with Image, Audio, Video, and Text Integration](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/Macaw-LLM-%20Multi-Modal%20Language%20Modeling%20with%20Image%2C%20Audio%2C%20Video%2C%20and%20Text%20Integration.pdf) |
| InstructBLIP | [InstructBLIP](https://github.com/salesforce/LAVIS/tree/main/projects/instructblip) | [InstructBLIP: Towards General-purpose Vision-Language Models with Instruction Tuning](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/InstructBLIP-%20Towards%20General-purpose%20Vision-Language%20Models%20with%20Instruction%20Tuning.pdf) |
| [MultiModal-GPT](https://github.com/open-mmlab/Multimodal-GPT#prepare-datasets) | [MultiModal-GPT](https://github.com/open-mmlab/Multimodal-GPT) | [MultiModal-GPT: A Vision and Language Model for Dialogue with Humans](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/MultiModal-GPT-%20A%20Vision%20and%20Language%20Model%20for%20Dialogue%20with%20Humans.pdf) |
| [Valley-Instruct-73](https://huggingface.co/datasets/luoruipu1/Valley-Instruct-73k) | [VALLEY](https://github.com/RupertLuo/Valley) | [VALLEY: VIDEO ASSISTANT WITH LARGE LANGUAGE MODEL ENHANCED ABILITY](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/VALLEY-%20VIDEO%20ASSISTANT%20WITH%20LARGE%20LANGUAGE%20MODEL%20ENHANCED%20ABILITY.pdf) |
| [Video-LLaMA](https://github.com/DAMO-NLP-SG/Video-LLaMA#data) | [Video-LLaMA](https://github.com/DAMO-NLP-SG/Video-LLaMA) | [Video-LLaMA: An Instruction-tuned Audio-Visual Language Model for Video Understanding](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/Video-LLaMA-%20An%20Instruction-tuned%20Audio-Visual%20Language%20Model%20for%20Video%20Understanding.pdf) |
| [MULTIINSTRUCT](https://github.com/VT-NLP/MultiInstruct#usage) | [OFA(multiinstruct)](https://github.com/VT-NLP/MultiInstruct) | [MULTIINSTRUCT: Improving Multi-Modal Zero-Shot Learning via Instruction Tuning](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/MULTIINSTRUCT-%20Improving%20Multi-Modal%20Zero-Shot%20Learning%20via%20Instruction%20Tuning.pdf) |
| [Video-ChatGPT](https://github.com/mbzuai-oryx/Video-ChatGPT#video-instruction-dataset-open_file_folder) | [Video-ChatGPT](https://github.com/mbzuai-oryx/Video-ChatGPT) | [Video-ChatGPT: Towards Detailed Video Understanding via Large Vision and Language Models](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/Video-ChatGPT-%20Towards%20Detailed%20Video%20Understanding%20via%20Large%20Vision%20and%20Language%20Models.pdf) |
| [MIMIC-IT](https://github.com/Luodian/Otter/blob/main/mimic-it/README.md) | [Otter](https://github.com/Luodian/Otter) | [MIMIC-IT: Multi-Modal In-Context Instruction Tuning](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/MIMIC-IT-%20Multi-Modal%20In-Context%20Instruction%20Tuning.pdf) |
| [M3IT](https://huggingface.co/datasets/MMInstruction/M3IT) | Ying-VLM | [M3IT: A Large-Scale Dataset towards Multi-Modal Multilingual Instruction Tuning](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/M3IT-%20A%20Large-Scale%20Dataset%20towards%20Multi-Modal%20Multilingual%20Instruction%20Tuning.pdf) |
| [GPT4Tools](https://github.com/AILab-CVC/GPT4Tools#dataset) | [GPT4Tools](https://github.com/AILab-CVC/GPT4Tools) | [GPT4Tools: Teaching Large Language Model to Use Tools via Self-instruction](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/GPT4Tools-%20Teaching%20Large%20Language%20Model%20to%20Use%20Tools%20via%20Self-instruction.pdf) |
| [PMC-VQA](https://huggingface.co/datasets/xmcmic/PMC-VQA) | [MedVInT-TE/TD](https://github.com/xiaoman-zhang/PMC-VQA) | [PMC-VQA: Visual Instruction Tuning for Medical Visual Question Answering](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/PMC-VQA-%20Visual%20Instruction%20Tuning%20for%20Medical%20Visual%20Question%20Answering.pdf) |
| [pandagpt_vid](https://huggingface.co/datasets/openllmplayground/pandagpt_visual_instruction_dataset) | [PandaGPT](https://github.com/yxuansu/PandaGPT) | [PandaGPT: One Model To Instruction-Follow Them All](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/PandaGPT-%20One%20Model%20To%20Instruction-Follow%20Them%20All.pdf) |
| [MULTIS](https://github.com/joez17/ChatBridge/blob/main/custom_datasets/valor_data/DATASET.md#second-stage-dataset-preparation) | [ChatBridge](https://github.com/joez17/ChatBridge) | [ChatBridge: Bridging Modalities with Large Language Model as a Language Catalyst](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/ChatBridge-%20Bridging%20Modalities%20with%20Large%20Language%20Model%20as%20a%20Language%20Catalyst.pdf) |
| [DetGPT](https://github.com/OptimalScale/DetGPT/blob/main/dataset/download.sh) | [DetGPT](https://github.com/OptimalScale/DetGPT) | [DetGPT: Detect What You Need via Reasoning](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/DetGPT-%20Detect%20What%20You%20Need%20via%20Reasoning.pdf) |
| [X-LLM](https://github.com/phellonchen/X-LLM#dataset) | [X-LLM](https://github.com/phellonchen/X-LLM) | [X-LLM: Bootstrapping Advanced Large Language Models by Treating Multi-Modalities as Foreign Languages](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/X-LLM-%20Bootstrapping%20Advanced%20Large%20Language%20Models%20by%20Treating%20Multi-Modalities%20as%20Foreign%20Languages.pdf) |
| [VideoChat](https://github.com/OpenGVLab/InternVideo/tree/main/Data/instruction_data) | [VideoChat](https://github.com/OpenGVLab/Ask-Anything) | [Videochat: Chat-centric video understanding](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/VideoChat%20-%20Chat-Centric%20Video%20Understanding.pdf) |
| [cc_sbu_align](https://huggingface.co/datasets/Vision-CAIR/cc_sbu_align) | [Mini-GPT4](https://github.com/Vision-CAIR/MiniGPT-4) | [Minigpt-4: Enhancing vision-language understanding with advanced large language models](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/Minigpt-4-%20Enhancing%20vision-language%20understanding%20with%20advanced%20large%20language%20models.pdf) |
| [LLaVA_instruct_150K](https://huggingface.co/datasets/liuhaotian/LLaVA-Instruct-150K) | [LLaVA](https://github.com/haotian-liu/LLaVA) | [Visual Instruction Tuning](https://github.com/palchenli/VL-Instruction-Tuning/blob/main/assert/paper/Visual%20Instruction%20Tuning.pdf) |
<span id="label_vdctp"> </span>
## 📂 Multi-Modal Instruction Data Constructed in This Paper
In this paper, we propose a vision-language instruction construction pipeline and generate a corresponding dataset.
Specifically, the generated instruction data consists of multi-round question answering about a given image.
Here are some examples of the generated instruction data:
### Examples
| Image | Conversation |
| :--- | :--- |
| <img src="https://github.com/palchenli/VL-Instruction-Tuning/raw/main/assert/images/COCO_train2014_000000288586.jpg" width="200%"/> | User: Can you describe the scene in the image?<br>AI: Sure, the image captures a large grizzly bear wading through a moving river. The bear has a brown coat and appears to be quite large and furry. It's currently not holding anything in its mouth and it doesn't seem to have caught any fish from the river.<br>User: Are grizzly bears typically found in such environments?<br>AI: Yes, indeed. Grizzly bears are often found near rivers or streams. They are excellent swimmers and often go into the water in search of food. During salmon spawning seasons, they are known to frequent river areas to catch the fish.<br>User: Is the bear possibly looking for food?<br>AI: That's a strong possibility. While there are no fish present in the river in the image, grizzly bears are known to forage for various food sources in rivers, including fish and other aquatic organisms. However, without further context, it's hard to say definitively.<br>User: What other characteristics does a grizzly bear have?<br>AI: Grizzly bears, which are a subspecies of brown bear, are known for their size, strength, and excellent swimming ability. They have a distinct hump over their shoulders which is a muscle mass used for digging and clawing. Their fur color can vary from very light tan to dark brown, with the fur on their back often being darker than the rest of their body. Also, despite their large size, grizzlies are quite fast and can run at speeds of up to 30 miles per hour. |
| <img src="https://github.com/palchenli/VL-Instruction-Tuning/raw/main/assert/images/COCO_train2014_000000030722.jpg" width="200%"/> | User: What is the main animal in the picture?<br>AI: The main animal in the picture is a giraffe.<br>User: What is the giraffe doing?<br>AI: The giraffe is standing next to a tree and some rocks. It's not doing much, just hanging around.<br>User: Is the giraffe standing on the rocks?<br>AI: No, the giraffe is not standing on the rocks. It is standing next to them.<br>User: Can you describe the setting in more detail?<br>AI: The picture mainly focuses on the giraffe, a tree, and some rocks. The giraffe is standing right next to the tree. There are rocks scattered around them.<br>User: Are there any elephants in the picture?<br>AI: No, there are no elephants in this picture. Only a single giraffe is present.<br>User: Are there any unique features of the giraffe?<br>AI: As per the details provided in the image, the giraffe is standing and has a spotted pattern on its skin, which is a characteristic feature of giraffes.<br>User: Is the giraffe sitting on the rocks?<br>AI: No, the giraffe is not sitting on the rocks. It's standing by them.<br>User: Is the giraffe eating the tree leaves?<br>AI: There's no indication in the image that the giraffe is eating the leaves of the tree. It's simply standing beside it.<br>User: Where do giraffes usually live?<br>AI: Giraffes are native to Africa and they usually inhabit savannas, grasslands, and open woodlands. However, in this particular image, the exact location isn't specified. |
| <img src="https://github.com/palchenli/VL-Instruction-Tuning/raw/main/assert/images/COCO_train2014_000000247884_0.jpg" width="200%"/> | User: Is there a car in the picture?<br>AI: Yes, there is a car in the picture and it appears to be parked.<br>User: What can you tell me about the car?<br>AI: Based on the image, the car is parked, which suggests that it is not currently in use. However, I can't provide specifics about its make, model, or color without more visual information.<br>User: Do you think the car is related to the boy on the skateboard?<br>AI: It's hard to definitively state a connection between the car and the boy on the skateboard based on the image alone. The car may belong to someone who is watching the boy, or it may just be parked in the vicinity where the boy is skateboarding.<br>User: Is the car moving?<br>AI: No, the car isn't moving. It's parked.<br>User: Could the car potentially pose a danger to the skateboarder?<br>AI: Since the car is parked, it's unlikely to pose a direct threat to the skateboarder. However, any vehicle, parked or moving, in the vicinity of skateboarders could potentially become a hazard, especially if the skateboarder loses control. This emphasizes the importance of skateboarding in safe, designated areas.<br>User: Is the car possibly there for a skateboarding event?<br>AI: Based on the information available, I can't definitely say if the car is there for a skateboarding event. It could belong to a spectator or participant, or it could be unrelated to the skateboarding activity. More context would be needed to make a clear conclusion. |
There are three different types of instruction data. The data statistics and download links are as follows.
### Download Links
| Data Type | Baidu CLoud | Google Drive | Huggingface |
| :--- | :---: | :---: | :---: |
| COCO_2014_Images | [url](https://cocodataset.org/) | [url](https://cocodataset.org/) | [url](https://cocodataset.org/) |
| Global | [url](https://pan.baidu.com/s/15Ge_lwge-YOxL55_0roOfA?pwd=inok) | [url](https://drive.usercontent.google.com/download?id=1rEzH0RhWqjq8W6zXc-t8Q3Tg3ncB1dpN&export=download&authuser=0&confirm=t&uuid=f574c321-ad4c-438e-94a6-8790db70c58f&at=APZUnTVglRBUCUC6tax-d3OH33Io:1700050876759) | [url](https://huggingface.co/datasets/lllchenlll/COCO_ARC/resolve/main/global.json?download=true) |
| Negative | [url](https://pan.baidu.com/s/1wuCkm443ufpG3-xcHVrRNA?pwd=auc7) | [url](https://drive.usercontent.google.com/download?id=1sQurFP7M_Ftd2Q5NSZm41_PCMT4ECd0g&export=download&authuser=0&confirm=t&uuid=fb82922c-0fd0-4b47-a5f1-af70f4d1b300&at=APZUnTUOOoYjM2gAhK79wsUkKUFk:1700051467871) | [url](https://huggingface.co/datasets/lllchenlll/COCO_ARC/resolve/main/negative.json?download=true) |
| Region | [url](https://pan.baidu.com/s/15m1RMpeirEz83Jsxd8zC0w?pwd=96p5) | [url](https://drive.usercontent.google.com/download?id=1Qbk4cOfTcrsPx7k1rD0E20hTkdhYNfBU&export=download&authuser=0&confirm=t&uuid=6fa256d3-e085-4089-9073-11799a7b3b74&at=APZUnTXdeLntbNQeEWgpD7SvulsM:1700051759650) | [url](https://huggingface.co/datasets/lllchenlll/COCO_ARC/resolve/main/region.json?download=true) |
| Region_Images | [url](https://pan.baidu.com/s/1NpggqYSLTjcTSlohLcXKLA?pwd=mhgo) | [url](https://drive.usercontent.google.com/download?id=1FMsU3sZLXDtumrNJK6CXgOd_YMIWXKaf&export=download&authuser=0&confirm=t&uuid=4eecfe1f-9807-478b-b501-54330c3713f4&at=APZUnTUCjk0W087kFmx5TECtBHRc:1700139356648) | [url](https://huggingface.co/datasets/lllchenlll/COCO_ARC/resolve/main/region_images.zip?download=true) |
### Data Format
```json
{
"image_source": "",
"construction_time": "",
"annotations": [
{
"img_ids": "",
"instruction_type": "",
"conversations": []
},
{
"img_ids": "",
"instruction_type": "",
"conversations": []
}
]
}
```
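The annotation files in this format can be consumed with a few lines of standard Python. The sketch below follows the field names shown above; the helper names and the idea of flattening annotations into tuples are our own illustration, not part of the released tooling:

```python
import json

def load_annotations(path):
    """Read one instruction file (e.g. 'global.json') into a dict."""
    with open(path, "r", encoding="utf-8") as f:
        return json.load(f)

def iter_conversations(data):
    """Yield (img_ids, instruction_type, conversations) for each annotation entry."""
    for ann in data.get("annotations", []):
        yield ann["img_ids"], ann["instruction_type"], ann["conversations"]
```

For example, `for img_ids, inst_type, conv in iter_conversations(load_annotations("global.json")): ...` iterates over all multi-round dialogues in the downloaded file.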
## 📎 Citation
If you found this repository useful, please consider citing:
```
@article{li2023visionlanguage,
title={Vision-Language Instruction Tuning: A Review and Analysis},
author={Chen Li and Yixiao Ge and Dian Li and Ying Shan},
year={2023},
eprint={2311.08172},
archivePrefix={arXiv},
primaryClass={cs.MM}
}
```
## 👍🏻 Acknowledgement
We would like to thank [LLaVA](https://github.com/haotian-liu/LLaVA), [LAVIS](https://github.com/salesforce/LAVIS) and [OpenFlamingo](https://github.com/mlfoundations/open_flamingo) for their well-architected multi-modal LLMs.
Thanks to [SEED-Bench](https://github.com/AILab-CVC/SEED-Bench) for being an open source and convenient benchmark for evaluating MLLMs. | [
-0.5239128470420837,
-0.780125617980957,
0.3489055633544922,
0.2152758687734604,
-0.1381908506155014,
-0.18643076717853546,
-0.3618106245994568,
-0.3477337658405304,
-0.16342955827713013,
0.5318589806556702,
-0.5713006258010864,
-0.8822371363639832,
-0.26092275977134705,
-0.266931742429733... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
xDAN-datasets/ultrachat_uncensored_500K | xDAN-datasets | 2023-11-15T09:48:39Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T09:48:39Z | 2023-11-15T09:45:24.000Z | 2023-11-15T09:45:24 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: input
sequence: string
- name: output
sequence: string
splits:
- name: train
num_bytes: 2890399190
num_examples: 269808
download_size: 1511652642
dataset_size: 2890399190
---
# Dataset Card for "ultrachat_uncensored_500K"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.4375920593738556,
-0.16263358294963837,
-0.013545719906687737,
0.2601725459098816,
-0.5655815005302429,
0.16253671050071716,
0.1411774605512619,
-0.25245410203933716,
0.8808465600013733,
0.8106746673583984,
-0.6341184377670288,
-0.8940858244895935,
-0.4378567039966583,
-0.23547968268394... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
NovelSense/syntra-experiment-dataset | NovelSense | 2023-11-15T11:34:45Z | 0 | 1 | null | [
"task_categories:object-detection",
"size_categories:1K<n<10K",
"license:cc-by-sa-4.0",
"traffic",
"vehicles",
"car",
"synthetic",
"mobility",
"infrastructure",
"doi:10.57967/hf/1350",
"region:us"
] | 2023-11-15T11:34:45Z | 2023-11-15T09:48:35.000Z | 2023-11-15T09:48:35 | ---
license: cc-by-sa-4.0
task_categories:
- object-detection
tags:
- traffic
- vehicles
- car
- synthetic
- mobility
- infrastructure
pretty_name: SYNTRA Experiment Dataset
size_categories:
- 1K<n<10K
---
# About
This is the *SYNTRA Experiment Dataset*. It is a sample dataset from the NovelSense SYNTRA EU Hubs 4 Data experiment (https://euhubs4data.eu/experiments/syntra/). The experiment supported the development of a web application reachable under https://syntra.app. The dataset is a synthetic traffic infrastructure dataset, e.g. for the validation, training, and optimization of your traffic AI models.
# Dataset description
The dataset was created by generating 14 different visualization configurations. These include the color spectrum of cars, camera noise, background, and driving trajectories, among others. The dataset consists of png and xml files.
Each png file has a corresponding xml file which contains the annotation information in PascalVOC format.
The structure of the png and xml file names is as follows:
XXXXXX-C-M_frame_F.(png|xml)
* XXXXXX -- string encoding of configuration
* C -- number of the configuration
* M -- video in this configuration
* F -- frame number in this video
# Limitation
The dataset was generated using a development version of SYNTRA and contains only cars.
# License
SYNTRA Experiment Dataset © 2023 by NovelSense UG is licensed under CC BY-SA 4.0
(https://creativecommons.org/licenses/by-sa/4.0/) | [
-0.48670172691345215,
-0.29542165994644165,
0.44899505376815796,
0.26629534363746643,
-0.35674700140953064,
0.09834297746419907,
-0.08685338497161865,
-0.36850592494010925,
0.23795773088932037,
0.2397209107875824,
-0.7228309512138367,
-0.4036223590373993,
-0.10443466156721115,
0.1056997254... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
feedback-to-code/Server_Text_Dataset_1 | feedback-to-code | 2023-11-15T09:51:02Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-15T09:51:02Z | 2023-11-15T09:50:23.000Z | 2023-11-15T09:50:23 | ---
license: apache-2.0
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Jim-Maar-Uni/Server_Test_Dataset | Jim-Maar-Uni | 2023-11-15T09:53:46Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-15T09:53:46Z | 2023-11-15T09:53:19.000Z | 2023-11-15T09:53:19 | ---
license: apache-2.0
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
makedelta/analyst_report_raw | makedelta | 2023-11-15T09:54:42Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T09:54:42Z | 2023-11-15T09:54:28.000Z | 2023-11-15T09:54:28 | ---
dataset_info:
features:
- name: content
dtype: string
- name: created_at
dtype: string
- name: pdf_content
dtype: string
- name: sentiment
dtype: float64
- name: summary1
dtype: string
- name: summary2
dtype: string
- name: title
dtype: string
- name: updated_at
dtype: string
splits:
- name: train
num_bytes: 151542728
num_examples: 5847
download_size: 77730208
dataset_size: 151542728
---
# Dataset Card for "analyst_report_raw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.37871652841567993,
-0.18331092596054077,
0.1402086466550827,
0.2528402805328369,
-0.0772259309887886,
0.21176066994667053,
0.26233214139938354,
-0.4033410847187042,
0.8026214838027954,
0.2961996793746948,
-0.7260176539421082,
-0.8000938892364502,
-0.41613975167274475,
-0.226834043860435... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Prals1938/practicequestion | Prals1938 | 2023-11-15T10:24:31Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-15T10:24:31Z | 2023-11-15T10:24:31.000Z | 2023-11-15T10:24:31 | ---
license: apache-2.0
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_itsliupeng__llama2_7b_zh_public | open-llm-leaderboard | 2023-11-15T10:55:15Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T10:55:15Z | 2023-11-15T10:54:30.000Z | 2023-11-15T10:54:30 | ---
pretty_name: Evaluation run of itsliupeng/llama2_7b_zh
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [itsliupeng/llama2_7b_zh](https://huggingface.co/itsliupeng/llama2_7b_zh) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_itsliupeng__llama2_7b_zh_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-15T10:51:37.128756](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__llama2_7b_zh_public/blob/main/results_2023-11-15T10-51-37.128756.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5969511263414031,\n\
\ \"acc_stderr\": 0.0329865461490785,\n \"acc_norm\": 0.6078135521201408,\n\
\ \"acc_norm_stderr\": 0.03376504385445851,\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326912,\n \"mc2\": 0.42858587749612026,\n\
\ \"mc2_stderr\": 0.014059235435250938,\n \"em\": 0.18791946308724833,\n\
\ \"em_stderr\": 0.004000599568072892,\n \"f1\": 0.23667890100671124,\n\
\ \"f1_stderr\": 0.003992615682814011\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47952218430034127,\n \"acc_stderr\": 0.01459913135303501,\n\
\ \"acc_norm\": 0.5204778156996587,\n \"acc_norm_stderr\": 0.01459913135303501\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5608444532961562,\n\
\ \"acc_stderr\": 0.004952698802275648,\n \"acc_norm\": 0.7487552280422227,\n\
\ \"acc_norm_stderr\": 0.004328425700998689\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.03761070869867479,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.03761070869867479\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.029224526469124792,\n\
\ \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.029224526469124792\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082634,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082634\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.03252909619613197,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.03252909619613197\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.044629175353369355,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.044629175353369355\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137285,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137285\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7064516129032258,\n\
\ \"acc_stderr\": 0.025906087021319295,\n \"acc_norm\": 0.7064516129032258,\n\
\ \"acc_norm_stderr\": 0.025906087021319295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139744,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139744\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5871794871794872,\n \"acc_stderr\": 0.024962683564331796,\n\
\ \"acc_norm\": 0.5871794871794872,\n \"acc_norm_stderr\": 0.024962683564331796\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0303883535518868,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0303883535518868\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8091743119266055,\n \"acc_stderr\": 0.01684767640009109,\n \"\
acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.01684767640009109\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n \"\
acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677697,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677697\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.03559039531617342,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.03559039531617342\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489298,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489298\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\
\ \"acc_stderr\": 0.014987270640946012,\n \"acc_norm\": 0.7726692209450831,\n\
\ \"acc_norm_stderr\": 0.014987270640946012\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.025070713719153176,\n\
\ \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.025070713719153176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3217877094972067,\n\
\ \"acc_stderr\": 0.015624236160792582,\n \"acc_norm\": 0.3217877094972067,\n\
\ \"acc_norm_stderr\": 0.015624236160792582\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.02705797462449438,\n\
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.02705797462449438\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464496,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464496\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.02646248777700187,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.02646248777700187\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.455019556714472,\n\
\ \"acc_stderr\": 0.012718456618701763,\n \"acc_norm\": 0.455019556714472,\n\
\ \"acc_norm_stderr\": 0.012718456618701763\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6160130718954249,\n \"acc_stderr\": 0.01967580813528151,\n \
\ \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.01967580813528151\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533207,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533207\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326912,\n \"mc2\": 0.42858587749612026,\n\
\ \"mc2_stderr\": 0.014059235435250938\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7174427782162589,\n \"acc_stderr\": 0.01265406285097139\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.18791946308724833,\n \
\ \"em_stderr\": 0.004000599568072892,\n \"f1\": 0.23667890100671124,\n\
\ \"f1_stderr\": 0.003992615682814011\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.06444275966641395,\n \"acc_stderr\": 0.006763391728488265\n\
\ }\n}\n```"
repo_url: https://huggingface.co/itsliupeng/llama2_7b_zh
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|arc:challenge|25_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|drop|3_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|gsm8k|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hellaswag|10_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|winogrande|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-15T10-51-37.128756.parquet'
- config_name: results
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- results_2023-11-15T10-51-37.128756.parquet
- split: latest
path:
- results_2023-11-15T10-51-37.128756.parquet
---
# Dataset Card for Evaluation run of itsliupeng/llama2_7b_zh
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/itsliupeng/llama2_7b_zh
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [itsliupeng/llama2_7b_zh](https://huggingface.co/itsliupeng/llama2_7b_zh) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_itsliupeng__llama2_7b_zh_public",
"harness_winogrande_5",
	split="latest")
```
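The timestamped split names listed in the YAML above follow a fixed pattern; a minimal sketch of parsing one back into a `datetime` (the format string is inferred from the split names in this card):

```python
from datetime import datetime

# Split names encode the run timestamp, e.g. "2023_11_15T10_51_37.128756"
# (format inferred from the split names listed in the YAML above).
split_name = "2023_11_15T10_51_37.128756"
run_time = datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")
print(run_time)  # 2023-11-15 10:51:37.128756
```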
## Latest results
These are the [latest results from run 2023-11-15T10:51:37.128756](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__llama2_7b_zh_public/blob/main/results_2023-11-15T10-51-37.128756.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the "results" config and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.5969511263414031,
"acc_stderr": 0.0329865461490785,
"acc_norm": 0.6078135521201408,
"acc_norm_stderr": 0.03376504385445851,
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326912,
"mc2": 0.42858587749612026,
"mc2_stderr": 0.014059235435250938,
"em": 0.18791946308724833,
"em_stderr": 0.004000599568072892,
"f1": 0.23667890100671124,
"f1_stderr": 0.003992615682814011
},
"harness|arc:challenge|25": {
"acc": 0.47952218430034127,
"acc_stderr": 0.01459913135303501,
"acc_norm": 0.5204778156996587,
"acc_norm_stderr": 0.01459913135303501
},
"harness|hellaswag|10": {
"acc": 0.5608444532961562,
"acc_stderr": 0.004952698802275648,
"acc_norm": 0.7487552280422227,
"acc_norm_stderr": 0.004328425700998689
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.03761070869867479,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.03761070869867479
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.029224526469124792,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.029224526469124792
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082634,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082634
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.03252909619613197,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.03252909619613197
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.044629175353369355,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.044629175353369355
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137285,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137285
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.025906087021319295,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.025906087021319295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.02649905770139744,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.02649905770139744
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5871794871794872,
"acc_stderr": 0.024962683564331796,
"acc_norm": 0.5871794871794872,
"acc_norm_stderr": 0.024962683564331796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0303883535518868,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0303883535518868
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8091743119266055,
"acc_stderr": 0.01684767640009109,
"acc_norm": 0.8091743119266055,
"acc_norm_stderr": 0.01684767640009109
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7303921568627451,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.7303921568627451,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677697,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677697
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489298,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489298
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.014987270640946012,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.014987270640946012
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.025070713719153176,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.025070713719153176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3217877094972067,
"acc_stderr": 0.015624236160792582,
"acc_norm": 0.3217877094972067,
"acc_norm_stderr": 0.015624236160792582
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.02705797462449438,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.02705797462449438
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464496,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464496
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.02646248777700187,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.02646248777700187
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.455019556714472,
"acc_stderr": 0.012718456618701763,
"acc_norm": 0.455019556714472,
"acc_norm_stderr": 0.012718456618701763
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.01967580813528151,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.01967580813528151
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533207,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533207
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326912,
"mc2": 0.42858587749612026,
"mc2_stderr": 0.014059235435250938
},
"harness|winogrande|5": {
"acc": 0.7174427782162589,
"acc_stderr": 0.01265406285097139
},
"harness|drop|3": {
"em": 0.18791946308724833,
"em_stderr": 0.004000599568072892,
"f1": 0.23667890100671124,
"f1_stderr": 0.003992615682814011
},
"harness|gsm8k|5": {
"acc": 0.06444275966641395,
"acc_stderr": 0.006763391728488265
}
}
```
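The results JSON pairs an aggregated "all" block with one block per task; a minimal sketch of extracting per-task accuracy (a few values inlined from the JSON above for illustration):

```python
# A small excerpt of the results structure shown above: an aggregated "all"
# block plus one block per evaluated task.
results = {
    "all": {"acc": 0.5969511263414031},
    "harness|winogrande|5": {"acc": 0.7174427782162589},
    "harness|gsm8k|5": {"acc": 0.06444275966641395},
}
# Collect per-task accuracy, skipping the aggregate block.
task_acc = {task: scores["acc"] for task, scores in results.items() if task != "all"}
print(task_acc["harness|winogrande|5"])  # 0.7174427782162589
```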
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7147862911224365,
-0.8196132183074951,
0.30508220195770264,
0.2541169822216034,
-0.20310327410697937,
-0.04504813253879547,
0.0033359909430146217,
-0.22313708066940308,
0.5802753567695618,
-0.024805089458823204,
-0.47601035237312317,
-0.703087568283081,
-0.45345014333724976,
0.243715286... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
finleyhu/cat | finleyhu | 2023-11-15T10:55:26Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-15T10:55:26Z | 2023-11-15T10:54:46.000Z | 2023-11-15T10:54:46 | ---
license: apache-2.0
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
lawinsider/uk_ner_contracts_spacy | lawinsider | 2023-11-15T12:35:01Z | 0 | 0 | null | [
"task_categories:text-classification",
"language:uk",
"legal",
"region:us"
] | 2023-11-15T12:35:01Z | 2023-11-15T11:49:30.000Z | 2023-11-15T11:49:30 | ---
task_categories:
- text-classification
language:
- uk
tags:
- legal
pretty_name: uk NER contracts
---
### Dataset Description
Legal Contracts Dataset for Training SpaCy NER Model
This repository contains a specially curated dataset of legal contracts. It is designed for training a Named Entity Recognition (NER) model with SpaCy, with the aim of recognizing and classifying four types of entities in the text:

- Contract Type
- Clause Title
- Clause Number
- Definition Title
The dataset includes a broad variety of legal contracts, covering diverse domains such as employment, real estate, services, sale, lease, etc.
Entities in the text have been manually labeled by experts in the field, ensuring high-quality training data for the model.
Each document in the dataset has been annotated in the following format:
(Start_Position, End_Position, Entity_Label)
For example, a clause title may be annotated as follows: (102, 115, 'clause title')
This will assist the NER model in identifying not only the text of the entity, but also its position within the document.
### Usage Guidelines
The dataset can be loaded into a SpaCy pipeline for training a NER model. For more information on how to train a NER model using SpaCy, please refer to the SpaCy documentation.
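A minimal preparation sketch for the annotation format described above might look like the following. Note that `TRAIN_DATA`, its example text, and its character offsets are hypothetical illustrations, not entries drawn from this dataset:

```python
# Convert (Start_Position, End_Position, Entity_Label) annotations
# into spaCy's binary training format (DocBin).
import spacy
from spacy.tokens import DocBin

# Hypothetical example in the annotation format described above:
TRAIN_DATA = [
    ("This Employment Agreement is made between ...",
     [(5, 25, "contract type")]),
]

nlp = spacy.blank("en")  # use spacy.blank("uk") for Ukrainian contracts
doc_bin = DocBin()
for text, annotations in TRAIN_DATA:
    doc = nlp.make_doc(text)
    spans = []
    for start, end, label in annotations:
        span = doc.char_span(start, end, label=label)
        if span is not None:  # char_span returns None for misaligned offsets
            spans.append(span)
    doc.ents = spans
    doc_bin.add(doc)
doc_bin.to_disk("train.spacy")
# Train with: python -m spacy train config.cfg --paths.train train.spacy
```

The `char_span` check guards against annotations whose offsets do not align with token boundaries, which would otherwise raise an error during conversion.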
| [
-0.3272746503353119,
-0.20843948423862457,
0.2940029203891754,
-0.18066485226154327,
-0.29984521865844727,
-0.22180481255054474,
0.12970420718193054,
-0.2938942313194275,
0.13568870723247528,
1.3982659578323364,
-0.1909702867269516,
-0.8425632119178772,
-0.7025293707847595,
0.2639006376266... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Sei-Johan/Splat | Sei-Johan | 2023-11-15T11:50:47Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T11:50:47Z | 2023-11-15T11:50:47.000Z | 2023-11-15T11:50:47 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
librarian-bots/arxiv_articles_new_dataset_with_citation_info_and_hub_info | librarian-bots | 2023-11-15T12:02:27Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T12:02:27Z | 2023-11-15T12:02:23.000Z | 2023-11-15T12:02:23 | ---
dataset_info:
features:
- name: id
dtype: string
- name: submitter
dtype: string
- name: authors
dtype: string
- name: title
dtype: string
- name: comments
dtype: string
- name: journal-ref
dtype: string
- name: doi
dtype: string
- name: report-no
dtype: string
- name: categories
dtype: string
- name: license
dtype: string
- name: abstract
dtype: string
- name: versions
list:
- name: version
dtype: string
- name: created
dtype: string
- name: update_date
dtype: timestamp[s]
- name: authors_parsed
sequence:
sequence: string
- name: prediction
dtype: string
- name: probability
dtype: float64
- name: citationCount
dtype: int64
- name: is_cited_on_hub
dtype: bool
splits:
- name: train
num_bytes: 70298541
num_examples: 42202
download_size: 39466135
dataset_size: 70298541
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "arxiv_articles_new_dataset_with_citation_info_and_hub_info"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.5829439759254456,
-0.4329889118671417,
0.30618658661842346,
0.24910968542099,
-0.1680542379617691,
0.05151134729385376,
0.43081721663475037,
-0.0675697773694992,
0.7744187712669373,
0.4104597866535187,
-0.38510316610336304,
-0.7647745013237,
-0.5501962304115295,
-0.04424724355340004,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
faizalnf1800/white-shirt-and-lightblue-necktie | faizalnf1800 | 2023-11-15T14:13:31Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T14:13:31Z | 2023-11-15T12:20:08.000Z | 2023-11-15T12:20:08 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
TanvirOnHF/g_Dorks | TanvirOnHF | 2023-11-15T22:48:05Z | 0 | 0 | null | [
"size_categories:10K<n<100K",
"language:en",
"license:cc-by-4.0",
"List",
"Google",
"GHDB",
"Dork",
"NSFW",
"region:us"
] | 2023-11-15T22:48:05Z | 2023-11-15T12:28:15.000Z | 2023-11-15T12:28:15 | ---
license: cc-by-4.0
pretty_name: Google Dorks
tags:
- List
- Google
- GHDB
- Dork
- NSFW
language:
- en
size_categories:
- 10K<n<100K
---
# g_Dorks [TXT dataset]
A dataset comprising a collection of popular Google dorks gathered from a wide variety of sources.
## Data Source
***Secret!***
## Disclaimer
Please note that while I strive to maintain data quality, I cannot guarantee the accuracy or quality of all entries in this dataset. Use it responsibly and exercise caution when relying on the data for any critical applications. Your feedback and contributions are greatly appreciated for improving the dataset's overall quality.
| [
-0.15022499859333038,
-0.8136981129646301,
0.406172513961792,
-0.3362917900085449,
-0.005506208632141352,
0.5878032445907593,
-0.000548003998119384,
-0.12516000866889954,
0.3894526958465576,
0.5589589476585388,
-0.9838457703590393,
-0.32415571808815,
-0.858224630355835,
-0.0302435196936130... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
bogdansinik/embeddings | bogdansinik | 2023-11-15T12:32:02Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-15T12:32:02Z | 2023-11-15T12:30:36.000Z | 2023-11-15T12:30:36 | ---
license: mit
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
wangxinze/verilog_prepare | wangxinze | 2023-11-15T13:03:27Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-15T13:03:27Z | 2023-11-15T12:57:09.000Z | 2023-11-15T12:57:09 | ---
license: apache-2.0
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
rokset3/slim_pajama_627b | rokset3 | 2023-11-15T13:00:00Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T13:00:00Z | 2023-11-15T13:00:00.000Z | 2023-11-15T13:00:00 | Entry not found | [
-0.32276472449302673,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965679168701,
0.7915717363357544,
0.07618629932403564,
0.7746022939682007,
0.2563222646713257,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
erikhenriksson/register_label_dataset_en | erikhenriksson | 2023-11-15T13:07:42Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T13:07:42Z | 2023-11-15T13:07:12.000Z | 2023-11-15T13:07:12 | ---
dataset_info:
features:
- name: chatgpt
struct:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 313808339
num_examples: 33915
download_size: 153206385
dataset_size: 313808339
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "register_label_dataset_en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.6239872574806213,
-0.05206839367747307,
0.12646713852882385,
0.19653840363025665,
-0.16978497803211212,
0.019851362332701683,
0.30457451939582825,
-0.2604672610759735,
0.8250133395195007,
0.3605251908302307,
-0.6889259815216064,
-0.952251672744751,
-0.6084628701210022,
-0.16860029101371... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
xDAN2099/Midjourney-messages-collection | xDAN2099 | 2023-11-15T13:37:10Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T13:37:10Z | 2023-11-15T13:08:01.000Z | 2023-11-15T13:08:01 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: channel_id
dtype: string
- name: content
dtype: string
- name: timestamp
dtype: string
- name: image_id
dtype: string
- name: height
dtype: int64
- name: width
dtype: int64
- name: url
dtype: string
- name: size
dtype: int64
splits:
- name: train
num_bytes: 28872193514
num_examples: 55082563
download_size: 0
dataset_size: 28872193514
---
# Dataset Card for "Midjourney-messages-collection"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.4884849786758423,
-0.06767985969781876,
0.31616199016571045,
0.4783931076526642,
-0.43289506435394287,
0.05481722205877304,
0.20588970184326172,
-0.08999904990196228,
1.0068416595458984,
0.4749454855918884,
-1.2832326889038086,
-0.650797426700592,
-0.7343312501907349,
-0.387829840183258... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Fabrice-TOK/Test | Fabrice-TOK | 2023-11-15T13:28:59Z | 0 | 0 | null | [
"region:us"
] | 2023-11-15T13:28:59Z | 2023-11-15T13:28:59.000Z | 2023-11-15T13:28:59 | Entry not found | [
-0.32276472449302673,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965679168701,
0.7915717363357544,
0.07618629932403564,
0.7746022939682007,
0.2563222646713257,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Kaue123456/PatatiZeLuiz | Kaue123456 | 2023-11-15T13:34:44Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-15T13:34:44Z | 2023-11-15T13:33:10.000Z | 2023-11-15T13:33:10 | ---
license: openrail
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null |