| column | type / range |
|---|---|
| id | string, 2–115 chars |
| author | string, 2–42 chars (nullable) |
| last_modified | timestamp[us, tz=UTC] |
| downloads | int64, 0–8.87M |
| likes | int64, 0–3.84k |
| paperswithcode_id | string, 2–45 chars (nullable) |
| tags | list |
| lastModified | timestamp[us, tz=UTC] |
| createdAt | string, 24 chars |
| key | string, 1 class |
| created | timestamp[us] |
| card | string, 1–1.01M chars |
| embedding | list |
| library_name | string, 21 classes |
| pipeline_tag | string, 27 classes |
| mask_token | null |
| card_data | null |
| widget_data | null |
| model_index | null |
| config | null |
| transformers_info | null |
| spaces | null |
| safetensors | null |
| transformersInfo | null |
| modelId | string, 5–111 chars (nullable) |
| embeddings | list |
- id: boborr/twil
- author: boborr
- last_modified: 2023-11-23T18:42:59Z
- downloads: 0
- likes: 0
- paperswithcode_id: null
- tags: ["license:openrail", "region:us"]
- lastModified: 2023-11-23T18:42:59Z
- createdAt: 2023-11-23T18:41:47.000Z
- created: 2023-11-23T18:41:47
- card:
    ---
    license: openrail
    ---
- embedding: [-0.1285335123538971, -0.1861683875322342, 0.6529128551483154, 0.49436232447624207, -0.19319400191307068, 0.23607441782951355, 0.36072009801864624, 0.05056373029947281, 0.5793656706809998, 0.7400146722793579, -0.650810182094574, -0.23784008622169495, -0.7102247476577759, -0.0478255338966846...]
- library_name, pipeline_tag, mask_token, card_data, widget_data, model_index, config, transformers_info, spaces, safetensors, transformersInfo, modelId, embeddings: null
- id: open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-200step-v2_public
- author: open-llm-leaderboard
- last_modified: 2023-11-23T18:43:38Z
- downloads: 0
- likes: 0
- paperswithcode_id: null
- tags: ["region:us"]
- lastModified: 2023-11-23T18:43:38Z
- createdAt: 2023-11-23T18:42:54.000Z
- created: 2023-11-23T18:42:54
- card:
---
pretty_name: Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-200step-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Korabbit/Llama-2-7b-chat-hf-afr-200step-v2](https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-200step-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-200step-v2_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-23T18:39:46.756166](https://huggingface.co/datasets/open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-200step-v2_public/blob/main/results_2023-11-23T18-39-46.756166.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4844173262075713,\n\
\ \"acc_stderr\": 0.03422515121320321,\n \"acc_norm\": 0.49096056822371376,\n\
\ \"acc_norm_stderr\": 0.03503280881820784,\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.43685304669032105,\n\
\ \"mc2_stderr\": 0.015582536589566296,\n \"em\": 0.02160234899328859,\n\
\ \"em_stderr\": 0.0014888393578850528,\n \"f1\": 0.08137164429530211,\n\
\ \"f1_stderr\": 0.0020119444875776374\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4880546075085324,\n \"acc_stderr\": 0.014607220340597171,\n\
\ \"acc_norm\": 0.5179180887372014,\n \"acc_norm_stderr\": 0.014602005585490978\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5889265086636128,\n\
\ \"acc_stderr\": 0.004910229643262741,\n \"acc_norm\": 0.7741485759808803,\n\
\ \"acc_norm_stderr\": 0.004172872282984212\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.539622641509434,\n \"acc_stderr\": 0.030676096599389184,\n\
\ \"acc_norm\": 0.539622641509434,\n \"acc_norm_stderr\": 0.030676096599389184\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n\
\ \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.5208333333333334,\n\
\ \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4046242774566474,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.4046242774566474,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.032081157507886836,\n\
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.032081157507886836\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.041657747757287644,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.041657747757287644\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.023695415009463087,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.023695415009463087\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924314,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924314\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5290322580645161,\n\
\ \"acc_stderr\": 0.028396016402761005,\n \"acc_norm\": 0.5290322580645161,\n\
\ \"acc_norm_stderr\": 0.028396016402761005\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.03413963805906235,\n\
\ \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.03413963805906235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.601010101010101,\n \"acc_stderr\": 0.034889016168527326,\n \"\
acc_norm\": 0.601010101010101,\n \"acc_norm_stderr\": 0.034889016168527326\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7150259067357513,\n \"acc_stderr\": 0.032577140777096614,\n\
\ \"acc_norm\": 0.7150259067357513,\n \"acc_norm_stderr\": 0.032577140777096614\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4358974358974359,\n \"acc_stderr\": 0.02514180151117749,\n \
\ \"acc_norm\": 0.4358974358974359,\n \"acc_norm_stderr\": 0.02514180151117749\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073835,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073835\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42016806722689076,\n \"acc_stderr\": 0.03206183783236152,\n\
\ \"acc_norm\": 0.42016806722689076,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6770642201834862,\n \"acc_stderr\": 0.02004811592341532,\n \"\
acc_norm\": 0.6770642201834862,\n \"acc_norm_stderr\": 0.02004811592341532\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3287037037037037,\n \"acc_stderr\": 0.032036140846700596,\n \"\
acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.032036140846700596\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6568627450980392,\n \"acc_stderr\": 0.033321399446680854,\n \"\
acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.033321399446680854\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03068582059661079,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03068582059661079\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5739910313901345,\n\
\ \"acc_stderr\": 0.03318833286217281,\n \"acc_norm\": 0.5739910313901345,\n\
\ \"acc_norm_stderr\": 0.03318833286217281\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04750077341199984,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04750077341199984\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.558282208588957,\n \"acc_stderr\": 0.03901591825836184,\n\
\ \"acc_norm\": 0.558282208588957,\n \"acc_norm_stderr\": 0.03901591825836184\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.04582124160161551,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.04582124160161551\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7136752136752137,\n\
\ \"acc_stderr\": 0.02961432369045665,\n \"acc_norm\": 0.7136752136752137,\n\
\ \"acc_norm_stderr\": 0.02961432369045665\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6819923371647509,\n\
\ \"acc_stderr\": 0.01665348627561539,\n \"acc_norm\": 0.6819923371647509,\n\
\ \"acc_norm_stderr\": 0.01665348627561539\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.02690290045866664,\n\
\ \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.02690290045866664\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2223463687150838,\n\
\ \"acc_stderr\": 0.013907189208156881,\n \"acc_norm\": 0.2223463687150838,\n\
\ \"acc_norm_stderr\": 0.013907189208156881\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5130718954248366,\n \"acc_stderr\": 0.028620130800700246,\n\
\ \"acc_norm\": 0.5130718954248366,\n \"acc_norm_stderr\": 0.028620130800700246\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5755627009646302,\n\
\ \"acc_stderr\": 0.028071928247946215,\n \"acc_norm\": 0.5755627009646302,\n\
\ \"acc_norm_stderr\": 0.028071928247946215\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5709876543209876,\n \"acc_stderr\": 0.027538925613470863,\n\
\ \"acc_norm\": 0.5709876543209876,\n \"acc_norm_stderr\": 0.027538925613470863\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.02872386385328128,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.02872386385328128\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34876140808344197,\n\
\ \"acc_stderr\": 0.01217203515712712,\n \"acc_norm\": 0.34876140808344197,\n\
\ \"acc_norm_stderr\": 0.01217203515712712\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.45588235294117646,\n \"acc_stderr\": 0.030254372573976684,\n\
\ \"acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.030254372573976684\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4934640522875817,\n \"acc_stderr\": 0.020226106567657807,\n \
\ \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.020226106567657807\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5224489795918368,\n \"acc_stderr\": 0.031976941187136725,\n\
\ \"acc_norm\": 0.5224489795918368,\n \"acc_norm_stderr\": 0.031976941187136725\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n\
\ \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n\
\ \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.03446296217088427,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.03446296217088427\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.43685304669032105,\n\
\ \"mc2_stderr\": 0.015582536589566296\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7190213101815311,\n \"acc_stderr\": 0.012632541095875824\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.02160234899328859,\n \
\ \"em_stderr\": 0.0014888393578850528,\n \"f1\": 0.08137164429530211,\n\
\ \"f1_stderr\": 0.0020119444875776374\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.07884761182714177,\n \"acc_stderr\": 0.007423390519873241\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-200step-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|arc:challenge|25_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|drop|3_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|gsm8k|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hellaswag|10_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T18-39-46.756166.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T18-39-46.756166.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- '**/details_harness|winogrande|5_2023-11-23T18-39-46.756166.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-23T18-39-46.756166.parquet'
- config_name: results
data_files:
- split: 2023_11_23T18_39_46.756166
path:
- results_2023-11-23T18-39-46.756166.parquet
- split: latest
path:
- results_2023-11-23T18-39-46.756166.parquet
---
# Dataset Card for Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-200step-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-200step-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
This dataset was automatically created during the evaluation run of model [Korabbit/Llama-2-7b-chat-hf-afr-200step-v2](https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-200step-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
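For instance, the aggregated numbers can be pulled straight from that "results" configuration; a minimal sketch, assuming the `datasets` library and access to the Hub:

```python
from datasets import load_dataset

# load the aggregated results of the run (the numbers displayed on the leaderboard)
results = load_dataset(
    "open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-200step-v2_public",
    "results",
    split="latest",
)
print(results)
```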
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-200step-v2_public",
"harness_winogrande_5",
split="train")
```
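To discover the available task configurations and their timestamped splits before loading one, the `datasets` helper functions can be used; a minimal sketch:

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-200step-v2_public"

# list every per-task configuration (harness_arc_challenge_25, ..., results)
configs = get_dataset_config_names(repo)
print(len(configs), "configurations")

# each configuration has one split per run timestamp, plus the "latest" alias
print(get_dataset_split_names(repo, "harness_winogrande_5"))
```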
## Latest results
These are the [latest results from run 2023-11-23T18:39:46.756166](https://huggingface.co/datasets/open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-200step-v2_public/blob/main/results_2023-11-23T18-39-46.756166.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one under its own configuration, in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.4844173262075713,
"acc_stderr": 0.03422515121320321,
"acc_norm": 0.49096056822371376,
"acc_norm_stderr": 0.03503280881820784,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.43685304669032105,
"mc2_stderr": 0.015582536589566296,
"em": 0.02160234899328859,
"em_stderr": 0.0014888393578850528,
"f1": 0.08137164429530211,
"f1_stderr": 0.0020119444875776374
},
"harness|arc:challenge|25": {
"acc": 0.4880546075085324,
"acc_stderr": 0.014607220340597171,
"acc_norm": 0.5179180887372014,
"acc_norm_stderr": 0.014602005585490978
},
"harness|hellaswag|10": {
"acc": 0.5889265086636128,
"acc_stderr": 0.004910229643262741,
"acc_norm": 0.7741485759808803,
"acc_norm_stderr": 0.004172872282984212
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.539622641509434,
"acc_stderr": 0.030676096599389184,
"acc_norm": 0.539622641509434,
"acc_norm_stderr": 0.030676096599389184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5208333333333334,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.5208333333333334,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4046242774566474,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.4046242774566474,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.032081157507886836,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.032081157507886836
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.041657747757287644,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.041657747757287644
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.023695415009463087,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.023695415009463087
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924314,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924314
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5290322580645161,
"acc_stderr": 0.028396016402761005,
"acc_norm": 0.5290322580645161,
"acc_norm_stderr": 0.028396016402761005
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.03413963805906235,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.03413963805906235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.601010101010101,
"acc_stderr": 0.034889016168527326,
"acc_norm": 0.601010101010101,
"acc_norm_stderr": 0.034889016168527326
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7150259067357513,
"acc_stderr": 0.032577140777096614,
"acc_norm": 0.7150259067357513,
"acc_norm_stderr": 0.032577140777096614
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4358974358974359,
"acc_stderr": 0.02514180151117749,
"acc_norm": 0.4358974358974359,
"acc_norm_stderr": 0.02514180151117749
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073835,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073835
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42016806722689076,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.42016806722689076,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6770642201834862,
"acc_stderr": 0.02004811592341532,
"acc_norm": 0.6770642201834862,
"acc_norm_stderr": 0.02004811592341532
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.032036140846700596,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.032036140846700596
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.033321399446680854,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.033321399446680854
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03068582059661079,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03068582059661079
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5739910313901345,
"acc_stderr": 0.03318833286217281,
"acc_norm": 0.5739910313901345,
"acc_norm_stderr": 0.03318833286217281
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04750077341199984,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04750077341199984
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.558282208588957,
"acc_stderr": 0.03901591825836184,
"acc_norm": 0.558282208588957,
"acc_norm_stderr": 0.03901591825836184
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.04582124160161551,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.04582124160161551
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7136752136752137,
"acc_stderr": 0.02961432369045665,
"acc_norm": 0.7136752136752137,
"acc_norm_stderr": 0.02961432369045665
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6819923371647509,
"acc_stderr": 0.01665348627561539,
"acc_norm": 0.6819923371647509,
"acc_norm_stderr": 0.01665348627561539
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5173410404624278,
"acc_stderr": 0.02690290045866664,
"acc_norm": 0.5173410404624278,
"acc_norm_stderr": 0.02690290045866664
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2223463687150838,
"acc_stderr": 0.013907189208156881,
"acc_norm": 0.2223463687150838,
"acc_norm_stderr": 0.013907189208156881
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5130718954248366,
"acc_stderr": 0.028620130800700246,
"acc_norm": 0.5130718954248366,
"acc_norm_stderr": 0.028620130800700246
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5755627009646302,
"acc_stderr": 0.028071928247946215,
"acc_norm": 0.5755627009646302,
"acc_norm_stderr": 0.028071928247946215
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5709876543209876,
"acc_stderr": 0.027538925613470863,
"acc_norm": 0.5709876543209876,
"acc_norm_stderr": 0.027538925613470863
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.02872386385328128,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.02872386385328128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34876140808344197,
"acc_stderr": 0.01217203515712712,
"acc_norm": 0.34876140808344197,
"acc_norm_stderr": 0.01217203515712712
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.030254372573976684,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.030254372573976684
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4934640522875817,
"acc_stderr": 0.020226106567657807,
"acc_norm": 0.4934640522875817,
"acc_norm_stderr": 0.020226106567657807
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5224489795918368,
"acc_stderr": 0.031976941187136725,
"acc_norm": 0.5224489795918368,
"acc_norm_stderr": 0.031976941187136725
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6467661691542289,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.6467661691542289,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.03446296217088427,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.03446296217088427
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.43685304669032105,
"mc2_stderr": 0.015582536589566296
},
"harness|winogrande|5": {
"acc": 0.7190213101815311,
"acc_stderr": 0.012632541095875824
},
"harness|drop|3": {
"em": 0.02160234899328859,
"em_stderr": 0.0014888393578850528,
"f1": 0.08137164429530211,
"f1_stderr": 0.0020119444875776374
},
"harness|gsm8k|5": {
"acc": 0.07884761182714177,
"acc_stderr": 0.007423390519873241
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-100step-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Korabbit/Llama-2-7b-chat-hf-afr-100step-v2](https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-100step-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-100step-v2_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-23T18:49:40.471713](https://huggingface.co/datasets/open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-100step-v2_public/blob/main/results_2023-11-23T18-49-40.471713.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4839603798631096,\n\
\ \"acc_stderr\": 0.034233481703847386,\n \"acc_norm\": 0.4904194469293332,\n\
\ \"acc_norm_stderr\": 0.0350370635088906,\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361005,\n \"mc2\": 0.4518194385088943,\n\
\ \"mc2_stderr\": 0.01565368058265292,\n \"em\": 0.04016359060402685,\n\
\ \"em_stderr\": 0.002010733562468151,\n \"f1\": 0.10108536073825498,\n\
\ \"f1_stderr\": 0.0024087765856211545\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.492320819112628,\n \"acc_stderr\": 0.01460966744089257,\n\
\ \"acc_norm\": 0.5264505119453925,\n \"acc_norm_stderr\": 0.014590931358120167\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5955984863572994,\n\
\ \"acc_stderr\": 0.004897728370737241,\n \"acc_norm\": 0.782513443537144,\n\
\ \"acc_norm_stderr\": 0.0041169313831573495\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5471698113207547,\n \"acc_stderr\": 0.03063562795796182,\n\
\ \"acc_norm\": 0.5471698113207547,\n \"acc_norm_stderr\": 0.03063562795796182\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n\
\ \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.5208333333333334,\n\
\ \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4046242774566474,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.4046242774566474,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n\
\ \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.045595221419582166,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.045595221419582166\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.023636975996101806,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.023636975996101806\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924314,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924314\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5225806451612903,\n \"acc_stderr\": 0.02841498501970786,\n \"\
acc_norm\": 0.5225806451612903,\n \"acc_norm_stderr\": 0.02841498501970786\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.35960591133004927,\n \"acc_stderr\": 0.033764582465095665,\n \"\
acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.033764582465095665\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5878787878787879,\n \"acc_stderr\": 0.03843566993588717,\n\
\ \"acc_norm\": 0.5878787878787879,\n \"acc_norm_stderr\": 0.03843566993588717\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6060606060606061,\n \"acc_stderr\": 0.034812853382329624,\n \"\
acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.034812853382329624\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7150259067357513,\n \"acc_stderr\": 0.032577140777096614,\n\
\ \"acc_norm\": 0.7150259067357513,\n \"acc_norm_stderr\": 0.032577140777096614\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4256410256410256,\n \"acc_stderr\": 0.02506909438729654,\n \
\ \"acc_norm\": 0.4256410256410256,\n \"acc_norm_stderr\": 0.02506909438729654\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42016806722689076,\n \"acc_stderr\": 0.03206183783236152,\n\
\ \"acc_norm\": 0.42016806722689076,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.036848815213890225,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.036848815213890225\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6788990825688074,\n \"acc_stderr\": 0.02001814977273375,\n \"\
acc_norm\": 0.6788990825688074,\n \"acc_norm_stderr\": 0.02001814977273375\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.0321495214780275,\n \"acc_norm\"\
: 0.3333333333333333,\n \"acc_norm_stderr\": 0.0321495214780275\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03308611113236434,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03308611113236434\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.6751054852320675,\n \"acc_stderr\": 0.03048603938910529,\n\
\ \"acc_norm\": 0.6751054852320675,\n \"acc_norm_stderr\": 0.03048603938910529\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n\
\ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.5695067264573991,\n\
\ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04750077341199984,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04750077341199984\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.558282208588957,\n \"acc_stderr\": 0.03901591825836184,\n\
\ \"acc_norm\": 0.558282208588957,\n \"acc_norm_stderr\": 0.03901591825836184\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.04656147110012351,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.04656147110012351\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.717948717948718,\n\
\ \"acc_stderr\": 0.029480360549541194,\n \"acc_norm\": 0.717948717948718,\n\
\ \"acc_norm_stderr\": 0.029480360549541194\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6781609195402298,\n\
\ \"acc_stderr\": 0.0167063814150579,\n \"acc_norm\": 0.6781609195402298,\n\
\ \"acc_norm_stderr\": 0.0167063814150579\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.026907849856282542,\n\
\ \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.026907849856282542\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2212290502793296,\n\
\ \"acc_stderr\": 0.013882164598887275,\n \"acc_norm\": 0.2212290502793296,\n\
\ \"acc_norm_stderr\": 0.013882164598887275\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.02861462475280544,\n\
\ \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.02861462475280544\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5755627009646302,\n\
\ \"acc_stderr\": 0.028071928247946215,\n \"acc_norm\": 0.5755627009646302,\n\
\ \"acc_norm_stderr\": 0.028071928247946215\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5679012345679012,\n \"acc_stderr\": 0.027563010971606676,\n\
\ \"acc_norm\": 0.5679012345679012,\n \"acc_norm_stderr\": 0.027563010971606676\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251458,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251458\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3494132985658409,\n\
\ \"acc_stderr\": 0.012177306252786686,\n \"acc_norm\": 0.3494132985658409,\n\
\ \"acc_norm_stderr\": 0.012177306252786686\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.45588235294117646,\n \"acc_stderr\": 0.03025437257397668,\n\
\ \"acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.03025437257397668\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4852941176470588,\n \"acc_stderr\": 0.020219083895133924,\n \
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.020219083895133924\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5265306122448979,\n \"acc_stderr\": 0.03196412734523272,\n\
\ \"acc_norm\": 0.5265306122448979,\n \"acc_norm_stderr\": 0.03196412734523272\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n\
\ \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n\
\ \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.03446296217088427,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.03446296217088427\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361005,\n \"mc2\": 0.4518194385088943,\n\
\ \"mc2_stderr\": 0.01565368058265292\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7229676400947119,\n \"acc_stderr\": 0.012577891015342412\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.04016359060402685,\n \
\ \"em_stderr\": 0.002010733562468151,\n \"f1\": 0.10108536073825498,\n\
\ \"f1_stderr\": 0.0024087765856211545\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.08491281273692192,\n \"acc_stderr\": 0.007678212824450797\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-100step-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|arc:challenge|25_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|drop|3_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|gsm8k|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hellaswag|10_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T18-49-40.471713.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T18-49-40.471713.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- '**/details_harness|winogrande|5_2023-11-23T18-49-40.471713.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-23T18-49-40.471713.parquet'
- config_name: results
data_files:
- split: 2023_11_23T18_49_40.471713
path:
- results_2023-11-23T18-49-40.471713.parquet
- split: latest
path:
- results_2023-11-23T18-49-40.471713.parquet
---
# Dataset Card for Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-100step-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-100step-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Korabbit/Llama-2-7b-chat-hf-afr-100step-v2](https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-100step-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-100step-v2_public",
"harness_winogrande_5",
split="train")
```
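You can also pull the aggregated metrics directly; this is a minimal sketch, using the "results" configuration and "latest" split declared in this card's configuration list:
```python
from datasets import load_dataset

# Load the aggregated metrics of the most recent evaluation run.
# "results" and "latest" are the config and split names listed above.
results = load_dataset(
    "open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-100step-v2_public",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated scores for the run
```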
## Latest results
These are the [latest results from run 2023-11-23T18:49:40.471713](https://huggingface.co/datasets/open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-100step-v2_public/blob/main/results_2023-11-23T18-49-40.471713.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4839603798631096,
"acc_stderr": 0.034233481703847386,
"acc_norm": 0.4904194469293332,
"acc_norm_stderr": 0.0350370635088906,
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361005,
"mc2": 0.4518194385088943,
"mc2_stderr": 0.01565368058265292,
"em": 0.04016359060402685,
"em_stderr": 0.002010733562468151,
"f1": 0.10108536073825498,
"f1_stderr": 0.0024087765856211545
},
"harness|arc:challenge|25": {
"acc": 0.492320819112628,
"acc_stderr": 0.01460966744089257,
"acc_norm": 0.5264505119453925,
"acc_norm_stderr": 0.014590931358120167
},
"harness|hellaswag|10": {
"acc": 0.5955984863572994,
"acc_stderr": 0.004897728370737241,
"acc_norm": 0.782513443537144,
"acc_norm_stderr": 0.0041169313831573495
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5471698113207547,
"acc_stderr": 0.03063562795796182,
"acc_norm": 0.5471698113207547,
"acc_norm_stderr": 0.03063562795796182
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5208333333333334,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.5208333333333334,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4046242774566474,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.4046242774566474,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.045595221419582166,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.045595221419582166
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.023636975996101806,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.023636975996101806
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924314,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924314
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5225806451612903,
"acc_stderr": 0.02841498501970786,
"acc_norm": 0.5225806451612903,
"acc_norm_stderr": 0.02841498501970786
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.033764582465095665,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.033764582465095665
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5878787878787879,
"acc_stderr": 0.03843566993588717,
"acc_norm": 0.5878787878787879,
"acc_norm_stderr": 0.03843566993588717
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.034812853382329624,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.034812853382329624
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7150259067357513,
"acc_stderr": 0.032577140777096614,
"acc_norm": 0.7150259067357513,
"acc_norm_stderr": 0.032577140777096614
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4256410256410256,
"acc_stderr": 0.02506909438729654,
"acc_norm": 0.4256410256410256,
"acc_norm_stderr": 0.02506909438729654
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42016806722689076,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.42016806722689076,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.036848815213890225,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.036848815213890225
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6788990825688074,
"acc_stderr": 0.02001814977273375,
"acc_norm": 0.6788990825688074,
"acc_norm_stderr": 0.02001814977273375
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0321495214780275,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0321495214780275
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236434,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236434
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6751054852320675,
"acc_stderr": 0.03048603938910529,
"acc_norm": 0.6751054852320675,
"acc_norm_stderr": 0.03048603938910529
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5695067264573991,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.5695067264573991,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04750077341199984,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04750077341199984
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.558282208588957,
"acc_stderr": 0.03901591825836184,
"acc_norm": 0.558282208588957,
"acc_norm_stderr": 0.03901591825836184
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.04656147110012351,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.04656147110012351
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.717948717948718,
"acc_stderr": 0.029480360549541194,
"acc_norm": 0.717948717948718,
"acc_norm_stderr": 0.029480360549541194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6781609195402298,
"acc_stderr": 0.0167063814150579,
"acc_norm": 0.6781609195402298,
"acc_norm_stderr": 0.0167063814150579
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.026907849856282542,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.026907849856282542
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2212290502793296,
"acc_stderr": 0.013882164598887275,
"acc_norm": 0.2212290502793296,
"acc_norm_stderr": 0.013882164598887275
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.02861462475280544,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.02861462475280544
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5755627009646302,
"acc_stderr": 0.028071928247946215,
"acc_norm": 0.5755627009646302,
"acc_norm_stderr": 0.028071928247946215
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5679012345679012,
"acc_stderr": 0.027563010971606676,
"acc_norm": 0.5679012345679012,
"acc_norm_stderr": 0.027563010971606676
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.028838921471251458,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.028838921471251458
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3494132985658409,
"acc_stderr": 0.012177306252786686,
"acc_norm": 0.3494132985658409,
"acc_norm_stderr": 0.012177306252786686
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.03025437257397668,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.03025437257397668
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.020219083895133924,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.020219083895133924
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5265306122448979,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.5265306122448979,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6467661691542289,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.6467661691542289,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.03446296217088427,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.03446296217088427
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361005,
"mc2": 0.4518194385088943,
"mc2_stderr": 0.01565368058265292
},
"harness|winogrande|5": {
"acc": 0.7229676400947119,
"acc_stderr": 0.012577891015342412
},
"harness|drop|3": {
"em": 0.04016359060402685,
"em_stderr": 0.002010733562468151,
"f1": 0.10108536073825498,
"f1_stderr": 0.0024087765856211545
},
"harness|gsm8k|5": {
"acc": 0.08491281273692192,
"acc_stderr": 0.007678212824450797
}
}
```
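The `*_stderr` fields above are standard errors of the corresponding scores. As a quick illustration (a minimal sketch, not part of the evaluation harness itself), a normal-approximation 95% confidence interval can be derived from an `acc` / `acc_stderr` pair taken from the JSON above:
```python
# Sketch: turn a per-task "acc" / "acc_stderr" pair from the results
# JSON above into an approximate 95% confidence interval (normal
# approximation). Numbers copied from the winogrande entry above.
results = {
    "harness|winogrande|5": {
        "acc": 0.7229676400947119,
        "acc_stderr": 0.012577891015342412,
    }
}

for task, metrics in results.items():
    acc, stderr = metrics["acc"], metrics["acc_stderr"]
    lo, hi = acc - 1.96 * stderr, acc + 1.96 * stderr
    print(f"{task}: acc = {acc:.4f} (95% CI [{lo:.4f}, {hi:.4f}])")
```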
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.6962785124778748,
-0.909622073173523,
0.2704402506351471,
0.22720259428024292,
-0.1899634748697281,
-0.012054228223860264,
0.016602469608187675,
-0.2410285621881485,
0.5974166989326477,
-0.010371976532042027,
-0.48733508586883545,
-0.6742585301399231,
-0.4375360310077667,
0.227247908711... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
whuil1213/1122 | whuil1213 | 2023-11-23T18:56:51Z | 0 | 0 | null | [
"region:us"
] | 2023-11-23T18:56:51Z | 2023-11-23T18:56:51.000Z | 2023-11-23T18:56:51 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_speechlessai__speechless-mistral-7b-dare-0.85_public | open-llm-leaderboard | 2023-11-23T19:04:13Z | 0 | 0 | null | [
"region:us"
] | 2023-11-23T19:04:13Z | 2023-11-23T19:03:28.000Z | 2023-11-23T19:03:28 | ---
pretty_name: Evaluation run of speechlessai/speechless-mistral-7b-dare-0.85
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [speechlessai/speechless-mistral-7b-dare-0.85](https://huggingface.co/speechlessai/speechless-mistral-7b-dare-0.85)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_speechlessai__speechless-mistral-7b-dare-0.85_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-23T19:00:24.923358](https://huggingface.co/datasets/open-llm-leaderboard/details_speechlessai__speechless-mistral-7b-dare-0.85_public/blob/main/results_2023-11-23T19-00-24.923358.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6369591583509581,\n\
\ \"acc_stderr\": 0.03209160104558823,\n \"acc_norm\": 0.6455116611491236,\n\
\ \"acc_norm_stderr\": 0.0327770298036848,\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.5067853019722414,\n\
\ \"mc2_stderr\": 0.015079174812087311,\n \"em\": 0.034395973154362415,\n\
\ \"em_stderr\": 0.0018663495487686948,\n \"f1\": 0.10012898489932888,\n\
\ \"f1_stderr\": 0.002256552299533148\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.606655290102389,\n \"acc_stderr\": 0.014275101465693026,\n\
\ \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.014084133118104298\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6532563234415455,\n\
\ \"acc_stderr\": 0.004749606196363343,\n \"acc_norm\": 0.8493328022306313,\n\
\ \"acc_norm_stderr\": 0.003569930987961452\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894444,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894444\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642525,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229876,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229876\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033463,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033463\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633507,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633507\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.818348623853211,\n \"acc_stderr\": 0.016530617409266878,\n \"\
acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266878\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.02636165166838909,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.02636165166838909\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3564245810055866,\n\
\ \"acc_stderr\": 0.016018239710513395,\n \"acc_norm\": 0.3564245810055866,\n\
\ \"acc_norm_stderr\": 0.016018239710513395\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.02440439492808787,\n\
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.02440439492808787\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.455019556714472,\n\
\ \"acc_stderr\": 0.012718456618701766,\n \"acc_norm\": 0.455019556714472,\n\
\ \"acc_norm_stderr\": 0.012718456618701766\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.02850145286039655,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.02850145286039655\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6601307189542484,\n \"acc_stderr\": 0.01916241858862356,\n \
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.01916241858862356\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252089,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252089\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.5067853019722414,\n\
\ \"mc2_stderr\": 0.015079174812087311\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7932123125493291,\n \"acc_stderr\": 0.011382566829235803\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.034395973154362415,\n \
\ \"em_stderr\": 0.0018663495487686948,\n \"f1\": 0.10012898489932888,\n\
\ \"f1_stderr\": 0.002256552299533148\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.19863532979529946,\n \"acc_stderr\": 0.010989694978252754\n\
\ }\n}\n```"
repo_url: https://huggingface.co/speechlessai/speechless-mistral-7b-dare-0.85
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|arc:challenge|25_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|drop|3_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|gsm8k|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hellaswag|10_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-00-24.923358.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T19-00-24.923358.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- '**/details_harness|winogrande|5_2023-11-23T19-00-24.923358.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-23T19-00-24.923358.parquet'
- config_name: results
data_files:
- split: 2023_11_23T19_00_24.923358
path:
- results_2023-11-23T19-00-24.923358.parquet
- split: latest
path:
- results_2023-11-23T19-00-24.923358.parquet
---
# Dataset Card for Evaluation run of speechlessai/speechless-mistral-7b-dare-0.85
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/speechlessai/speechless-mistral-7b-dare-0.85
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [speechlessai/speechless-mistral-7b-dare-0.85](https://huggingface.co/speechlessai/speechless-mistral-7b-dare-0.85) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_speechlessai__speechless-mistral-7b-dare-0.85_public",
"harness_winogrande_5",
split="train")
```
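Because every run is also stored under its own timestamped split, the `latest` split is a convenient alias for the newest one. A minimal sketch of loading it (assuming the same repository and configuration names listed in this card's YAML header):

```python
from datasets import load_dataset

# "latest" aliases the most recent timestamped split declared in the card's
# YAML configs, so this picks up the newest evaluation details for the task.
data = load_dataset(
    "open-llm-leaderboard/details_speechlessai__speechless-mistral-7b-dare-0.85_public",
    "harness_winogrande_5",
    split="latest",
)
```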
## Latest results
These are the [latest results from run 2023-11-23T19:00:24.923358](https://huggingface.co/datasets/open-llm-leaderboard/details_speechlessai__speechless-mistral-7b-dare-0.85_public/blob/main/results_2023-11-23T19-00-24.923358.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6369591583509581,
"acc_stderr": 0.03209160104558823,
"acc_norm": 0.6455116611491236,
"acc_norm_stderr": 0.0327770298036848,
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.5067853019722414,
"mc2_stderr": 0.015079174812087311,
"em": 0.034395973154362415,
"em_stderr": 0.0018663495487686948,
"f1": 0.10012898489932888,
"f1_stderr": 0.002256552299533148
},
"harness|arc:challenge|25": {
"acc": 0.606655290102389,
"acc_stderr": 0.014275101465693026,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.014084133118104298
},
"harness|hellaswag|10": {
"acc": 0.6532563234415455,
"acc_stderr": 0.004749606196363343,
"acc_norm": 0.8493328022306313,
"acc_norm_stderr": 0.003569930987961452
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894444,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642525,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229876,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229876
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033463,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033463
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633507,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633507
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266878,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266878
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.02636165166838909,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.02636165166838909
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3564245810055866,
"acc_stderr": 0.016018239710513395,
"acc_norm": 0.3564245810055866,
"acc_norm_stderr": 0.016018239710513395
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.02440439492808787,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.02440439492808787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.455019556714472,
"acc_stderr": 0.012718456618701766,
"acc_norm": 0.455019556714472,
"acc_norm_stderr": 0.012718456618701766
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.02850145286039655,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.02850145286039655
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.01916241858862356,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.01916241858862356
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252089,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252089
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.5067853019722414,
"mc2_stderr": 0.015079174812087311
},
"harness|winogrande|5": {
"acc": 0.7932123125493291,
"acc_stderr": 0.011382566829235803
},
"harness|drop|3": {
"em": 0.034395973154362415,
"em_stderr": 0.0018663495487686948,
"f1": 0.10012898489932888,
"f1_stderr": 0.002256552299533148
},
"harness|gsm8k|5": {
"acc": 0.19863532979529946,
"acc_stderr": 0.010989694978252754
}
}
```
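The aggregated numbers above are also stored in the `results` configuration declared in the YAML header. A minimal sketch for retrieving them programmatically (assuming the repository name used throughout this card, and that the results parquet stores one row of aggregated metrics per run):

```python
from datasets import load_dataset

# The "results" configuration points at results_*.parquet; its "latest" split
# holds the most recent run's aggregated metrics.
results = load_dataset(
    "open-llm-leaderboard/details_speechlessai__speechless-mistral-7b-dare-0.85_public",
    "results",
    split="latest",
)
print(results[0])  # aggregated acc / acc_norm / mc1 / mc2 values, etc.
```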
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7032472491264343,
-0.8564736247062683,
0.27099525928497314,
0.25537535548210144,
-0.178325355052948,
-0.09935709089040756,
0.006960620637983084,
-0.22016648948192596,
0.5373685956001282,
-0.03019183874130249,
-0.49443912506103516,
-0.707866907119751,
-0.4021880328655243,
0.2323739230632... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
anand-kamble/Custom_common_voice_dataset_using_RVC | anand-kamble | 2023-11-23T20:05:47Z | 0 | 0 | null | [
"size_categories:10K<n<100K",
"source_datasets:common_voice_v13",
"language:hi",
"license:cc0-1.0",
"region:us"
] | 2023-11-23T20:05:47Z | 2023-11-23T19:05:25.000Z | 2023-11-23T19:05:25 | ---
license: cc0-1.0
language:
- hi
pretty_name: Custom Common Voice
size_categories:
- 10K<n<100K
viewer: true
source_datasets:
- common_voice_v13
---
### Licensing Information
Public Domain, [CC-0](https://creativecommons.org/share-your-work/public-domain/cc0/)
### Citation Information
```
@inproceedings{commonvoice:2020,
author = {Ardila, R. and Branson, M. and Davis, K. and Henretty, M. and Kohler, M. and Meyer, J. and Morais, R. and Saunders, L. and Tyers, F. M. and Weber, G.},
title = {Common Voice: A Massively-Multilingual Speech Corpus},
booktitle = {Proceedings of the 12th Conference on Language Resources and Evaluation (LREC 2020)},
pages = {4211--4215},
year = 2020
}
``` | [
-0.4885442852973938,
-0.35294458270072937,
0.33604246377944946,
0.3865574598312378,
-0.21295322477817535,
0.04444192722439766,
-0.4245809316635132,
-0.42394769191741943,
0.20283187925815582,
0.5575846433639526,
-0.3198986351490021,
-0.5811721086502075,
-0.539591372013092,
0.285639971494674... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
zakhtar/stanford-test | zakhtar | 2023-11-23T19:11:42Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-23T19:11:42Z | 2023-11-23T19:11:42.000Z | 2023-11-23T19:11:42 | ---
license: mit
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_souvik0306__mistral_7b_2epoch_norobots_public | open-llm-leaderboard | 2023-11-23T19:21:52Z | 0 | 0 | null | [
"region:us"
] | 2023-11-23T19:21:52Z | 2023-11-23T19:21:07.000Z | 2023-11-23T19:21:07 | ---
pretty_name: Evaluation run of souvik0306/mistral_7b_2epoch_norobots
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [souvik0306/mistral_7b_2epoch_norobots](https://huggingface.co/souvik0306/mistral_7b_2epoch_norobots)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_souvik0306__mistral_7b_2epoch_norobots_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-23T19:18:06.825101](https://huggingface.co/datasets/open-llm-leaderboard/details_souvik0306__mistral_7b_2epoch_norobots_public/blob/main/results_2023-11-23T19-18-06.825101.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6330598338387582,\n\
\ \"acc_stderr\": 0.03226570631734972,\n \"acc_norm\": 0.6423858579070316,\n\
\ \"acc_norm_stderr\": 0.0329680806753492,\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.015702107090627897,\n \"mc2\": 0.4261552372929774,\n\
\ \"mc2_stderr\": 0.014190532295151336,\n \"em\": 0.0016778523489932886,\n\
\ \"em_stderr\": 0.00041913301788268467,\n \"f1\": 0.062363674496644275,\n\
\ \"f1_stderr\": 0.0013875357781658866\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5708191126279863,\n \"acc_stderr\": 0.014464085894870653,\n\
\ \"acc_norm\": 0.6100682593856656,\n \"acc_norm_stderr\": 0.014252959848892894\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6281617207727545,\n\
\ \"acc_stderr\": 0.004823078145064964,\n \"acc_norm\": 0.833698466440948,\n\
\ \"acc_norm_stderr\": 0.0037159010850549875\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601688,\n \"\
acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601688\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n\
\ \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n\
\ \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635474,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635474\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010354,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010354\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n\
\ \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n\
\ \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n\
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822915,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822915\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709698,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709698\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281386,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281386\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381387,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381387\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3039106145251397,\n\
\ \"acc_stderr\": 0.01538284558758452,\n \"acc_norm\": 0.3039106145251397,\n\
\ \"acc_norm_stderr\": 0.01538284558758452\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.026082700695399662,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.026082700695399662\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.02456922360046085,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.02456922360046085\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n\
\ \"acc_stderr\": 0.012685906538206247,\n \"acc_norm\": 0.4426336375488918,\n\
\ \"acc_norm_stderr\": 0.012685906538206247\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000318,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000318\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.015702107090627897,\n \"mc2\": 0.4261552372929774,\n\
\ \"mc2_stderr\": 0.014190532295151336\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7908445146014207,\n \"acc_stderr\": 0.01143045004588158\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0016778523489932886,\n \
\ \"em_stderr\": 0.00041913301788268467,\n \"f1\": 0.062363674496644275,\n\
\ \"f1_stderr\": 0.0013875357781658866\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.16982562547384383,\n \"acc_stderr\": 0.010342572360861205\n\
\ }\n}\n```"
repo_url: https://huggingface.co/souvik0306/mistral_7b_2epoch_norobots
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|arc:challenge|25_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|drop|3_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|gsm8k|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hellaswag|10_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-18-06.825101.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T19-18-06.825101.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- '**/details_harness|winogrande|5_2023-11-23T19-18-06.825101.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-23T19-18-06.825101.parquet'
- config_name: results
data_files:
- split: 2023_11_23T19_18_06.825101
path:
- results_2023-11-23T19-18-06.825101.parquet
- split: latest
path:
- results_2023-11-23T19-18-06.825101.parquet
---
# Dataset Card for Evaluation run of souvik0306/mistral_7b_2epoch_norobots
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/souvik0306/mistral_7b_2epoch_norobots
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [souvik0306/mistral_7b_2epoch_norobots](https://huggingface.co/souvik0306/mistral_7b_2epoch_norobots) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_souvik0306__mistral_7b_2epoch_norobots_public",
"harness_winogrande_5",
split="train")
```
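You can also enumerate the available configurations, or load the aggregated "results" configuration directly; a minimal sketch (the repo id and the "latest" split come from the configuration list above, and `get_dataset_config_names` assumes a recent version of the `datasets` library):
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_souvik0306__mistral_7b_2epoch_norobots_public"

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs))  # 64, per the summary above

# The "latest" split of "results" always points to the most recent run.
results = load_dataset(repo, "results", split="latest")
```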
## Latest results
These are the [latest results from run 2023-11-23T19:18:06.825101](https://huggingface.co/datasets/open-llm-leaderboard/details_souvik0306__mistral_7b_2epoch_norobots_public/blob/main/results_2023-11-23T19-18-06.825101.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6330598338387582,
"acc_stderr": 0.03226570631734972,
"acc_norm": 0.6423858579070316,
"acc_norm_stderr": 0.0329680806753492,
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627897,
"mc2": 0.4261552372929774,
"mc2_stderr": 0.014190532295151336,
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788268467,
"f1": 0.062363674496644275,
"f1_stderr": 0.0013875357781658866
},
"harness|arc:challenge|25": {
"acc": 0.5708191126279863,
"acc_stderr": 0.014464085894870653,
"acc_norm": 0.6100682593856656,
"acc_norm_stderr": 0.014252959848892894
},
"harness|hellaswag|10": {
"acc": 0.6281617207727545,
"acc_stderr": 0.004823078145064964,
"acc_norm": 0.833698466440948,
"acc_norm_stderr": 0.0037159010850549875
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601688,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601688
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635474,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635474
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010354,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010354
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822915,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822915
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709698,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709698
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281386,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281386
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381387,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381387
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3039106145251397,
"acc_stderr": 0.01538284558758452,
"acc_norm": 0.3039106145251397,
"acc_norm_stderr": 0.01538284558758452
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399662,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399662
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.02456922360046085,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.02456922360046085
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4426336375488918,
"acc_stderr": 0.012685906538206247,
"acc_norm": 0.4426336375488918,
"acc_norm_stderr": 0.012685906538206247
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000318,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000318
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627897,
"mc2": 0.4261552372929774,
"mc2_stderr": 0.014190532295151336
},
"harness|winogrande|5": {
"acc": 0.7908445146014207,
"acc_stderr": 0.01143045004588158
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788268467,
"f1": 0.062363674496644275,
"f1_stderr": 0.0013875357781658866
},
"harness|gsm8k|5": {
"acc": 0.16982562547384383,
"acc_stderr": 0.010342572360861205
}
}
```
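Since the block above is plain JSON, an individual metric can be pulled out by indexing into the nested dictionary; a minimal sketch, assuming the text has been saved locally as `latest_results.json` (a hypothetical filename) with exactly the structure shown:
```python
import json

# Hypothetical local copy holding exactly the dictionary printed above.
with open("latest_results.json") as f:
    results = json.load(f)

# "all" carries the averaged metrics; per-task entries are keyed by harness name.
print(results["all"]["acc_norm"])          # 0.6423858579070316
print(results["harness|gsm8k|5"]["acc"])   # 0.16982562547384383
```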
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.6919941306114197,
-0.8404462337493896,
0.2667767405509949,
0.20768991112709045,
-0.18128274381160736,
-0.10106763988733292,
0.029694609344005585,
-0.20680734515190125,
0.5660512447357178,
0.020192841067910194,
-0.5052782297134399,
-0.7209584712982178,
-0.4492606520652771,
0.219303712248... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_uukuguy__CollectiveCognition-v1.1-Mistral-7B-dare-0.85_public | open-llm-leaderboard | 2023-11-23T19:23:08Z | 0 | 0 | null | [
"region:us"
] | 2023-11-23T19:23:08Z | 2023-11-23T19:22:22.000Z | 2023-11-23T19:22:22 | ---
pretty_name: Evaluation run of uukuguy/CollectiveCognition-v1.1-Mistral-7B-dare-0.85
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/CollectiveCognition-v1.1-Mistral-7B-dare-0.85](https://huggingface.co/uukuguy/CollectiveCognition-v1.1-Mistral-7B-dare-0.85)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__CollectiveCognition-v1.1-Mistral-7B-dare-0.85_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-23T19:19:22.420919](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__CollectiveCognition-v1.1-Mistral-7B-dare-0.85_public/blob/main/results_2023-11-23T19-19-22.420919.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6373539881235634,\n\
\ \"acc_stderr\": 0.032200043467933794,\n \"acc_norm\": 0.6462425671540708,\n\
\ \"acc_norm_stderr\": 0.032891781056948864,\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.44867041308885225,\n\
\ \"mc2_stderr\": 0.014511741253113358,\n \"em\": 0.001572986577181208,\n\
\ \"em_stderr\": 0.00040584511324177333,\n \"f1\": 0.06318477348993282,\n\
\ \"f1_stderr\": 0.0013946687452644612\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5802047781569966,\n \"acc_stderr\": 0.014422181226303026,\n\
\ \"acc_norm\": 0.6100682593856656,\n \"acc_norm_stderr\": 0.014252959848892893\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6451902011551484,\n\
\ \"acc_stderr\": 0.004774778180345194,\n \"acc_norm\": 0.8430591515634336,\n\
\ \"acc_norm_stderr\": 0.0036300159898963996\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.038781398887976104,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.038781398887976104\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137282,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137282\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612927,\n \"\
acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612927\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.033922384053216174,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.033922384053216174\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808514,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808514\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934724,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934724\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973133,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973133\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32737430167597764,\n\
\ \"acc_stderr\": 0.015694238967737383,\n \"acc_norm\": 0.32737430167597764,\n\
\ \"acc_norm_stderr\": 0.015694238967737383\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824775,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824775\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984824,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984824\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45371577574967403,\n\
\ \"acc_stderr\": 0.012715404841277738,\n \"acc_norm\": 0.45371577574967403,\n\
\ \"acc_norm_stderr\": 0.012715404841277738\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160896,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160896\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.44867041308885225,\n\
\ \"mc2_stderr\": 0.014511741253113358\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7884767166535123,\n \"acc_stderr\": 0.011477747684223194\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.001572986577181208,\n \
\ \"em_stderr\": 0.00040584511324177333,\n \"f1\": 0.06318477348993282,\n\
\ \"f1_stderr\": 0.0013946687452644612\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.18953752843062927,\n \"acc_stderr\": 0.010795837931896386\n\
\ }\n}\n```"
repo_url: https://huggingface.co/uukuguy/CollectiveCognition-v1.1-Mistral-7B-dare-0.85
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|arc:challenge|25_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|drop|3_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|gsm8k|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hellaswag|10_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|winogrande|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-23T19-19-22.420919.parquet'
- config_name: results
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- results_2023-11-23T19-19-22.420919.parquet
- split: latest
path:
- results_2023-11-23T19-19-22.420919.parquet
---
# Dataset Card for Evaluation run of uukuguy/CollectiveCognition-v1.1-Mistral-7B-dare-0.85
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/CollectiveCognition-v1.1-Mistral-7B-dare-0.85
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/CollectiveCognition-v1.1-Mistral-7B-dare-0.85](https://huggingface.co/uukuguy/CollectiveCognition-v1.1-Mistral-7B-dare-0.85) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__CollectiveCognition-v1.1-Mistral-7B-dare-0.85_public",
"harness_winogrande_5",
split="train")
```
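As a minimal sketch of the same idea, using only the config and split names declared in this card's YAML header, the aggregated "results" configuration can be pinned to the most recent run via its "latest" split:
```python
from datasets import load_dataset

# Same repository as above; "results" and "latest" are the config and
# split names listed in the YAML header of this card.
results = load_dataset("open-llm-leaderboard/details_uukuguy__CollectiveCognition-v1.1-Mistral-7B-dare-0.85_public",
	"results",
	split="latest")
```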
## Latest results
These are the [latest results from run 2023-11-23T19:19:22.420919](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__CollectiveCognition-v1.1-Mistral-7B-dare-0.85_public/blob/main/results_2023-11-23T19-19-22.420919.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6373539881235634,
"acc_stderr": 0.032200043467933794,
"acc_norm": 0.6462425671540708,
"acc_norm_stderr": 0.032891781056948864,
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.44867041308885225,
"mc2_stderr": 0.014511741253113358,
"em": 0.001572986577181208,
"em_stderr": 0.00040584511324177333,
"f1": 0.06318477348993282,
"f1_stderr": 0.0013946687452644612
},
"harness|arc:challenge|25": {
"acc": 0.5802047781569966,
"acc_stderr": 0.014422181226303026,
"acc_norm": 0.6100682593856656,
"acc_norm_stderr": 0.014252959848892893
},
"harness|hellaswag|10": {
"acc": 0.6451902011551484,
"acc_stderr": 0.004774778180345194,
"acc_norm": 0.8430591515634336,
"acc_norm_stderr": 0.0036300159898963996
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.038781398887976104,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.038781398887976104
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137282,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137282
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612927,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612927
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.033922384053216174,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.033922384053216174
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808514,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808514
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.03192193448934724,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.03192193448934724
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973133,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973133
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32737430167597764,
"acc_stderr": 0.015694238967737383,
"acc_norm": 0.32737430167597764,
"acc_norm_stderr": 0.015694238967737383
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824775,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824775
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984824,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984824
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45371577574967403,
"acc_stderr": 0.012715404841277738,
"acc_norm": 0.45371577574967403,
"acc_norm_stderr": 0.012715404841277738
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160896,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160896
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.44867041308885225,
"mc2_stderr": 0.014511741253113358
},
"harness|winogrande|5": {
"acc": 0.7884767166535123,
"acc_stderr": 0.011477747684223194
},
"harness|drop|3": {
"em": 0.001572986577181208,
"em_stderr": 0.00040584511324177333,
"f1": 0.06318477348993282,
"f1_stderr": 0.0013946687452644612
},
"harness|gsm8k|5": {
"acc": 0.18953752843062927,
"acc_stderr": 0.010795837931896386
}
}
```
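As a sketch of retrieving this file programmatically (assuming the `huggingface_hub` client, which this card does not otherwise mention), the JSON linked above can be downloaded and parsed directly:
```python
import json

from huggingface_hub import hf_hub_download

# Repo id and filename are copied verbatim from the results link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_uukuguy__CollectiveCognition-v1.1-Mistral-7B-dare-0.85_public",
    filename="results_2023-11-23T19-19-22.420919.json",
    repo_type="dataset",
)
with open(path) as f:
    metrics = json.load(f)

# Assuming the file mirrors the snippet above:
print(metrics["all"]["acc"])  # aggregated accuracy for this run
```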
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.6899695992469788,
-0.8110079765319824,
0.2939409911632538,
0.19302727282047272,
-0.20352885127067566,
-0.0552542507648468,
0.0008547506295144558,
-0.23187100887298584,
0.5645169615745544,
0.0018880653660744429,
-0.47825363278388977,
-0.7481122016906738,
-0.43917906284332275,
0.221680060... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_ZoidBB__unraveled-7b-a1_public | open-llm-leaderboard | 2023-11-23T19:26:40Z | 0 | 0 | null | [
"region:us"
] | 2023-11-23T19:26:40Z | 2023-11-23T19:25:54.000Z | 2023-11-23T19:25:54 | ---
pretty_name: Evaluation run of ZoidBB/unraveled-7b-a1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ZoidBB/unraveled-7b-a1](https://huggingface.co/ZoidBB/unraveled-7b-a1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ZoidBB__unraveled-7b-a1_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-23T19:22:53.071269](https://huggingface.co/datasets/open-llm-leaderboard/details_ZoidBB__unraveled-7b-a1_public/blob/main/results_2023-11-23T19-22-53.071269.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.627016601295214,\n\
\ \"acc_stderr\": 0.03235619334922418,\n \"acc_norm\": 0.6365794887378388,\n\
\ \"acc_norm_stderr\": 0.03306927475699416,\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.42228384526614654,\n\
\ \"mc2_stderr\": 0.014152177395393957,\n \"em\": 0.0017827181208053692,\n\
\ \"em_stderr\": 0.00043200973460387867,\n \"f1\": 0.06056837248322149,\n\
\ \"f1_stderr\": 0.0013671084143061485\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5648464163822525,\n \"acc_stderr\": 0.014487986197186045,\n\
\ \"acc_norm\": 0.5981228668941979,\n \"acc_norm_stderr\": 0.014327268614578274\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.635929097789285,\n\
\ \"acc_stderr\": 0.004801852881329739,\n \"acc_norm\": 0.8280223063134834,\n\
\ \"acc_norm_stderr\": 0.0037658983649388736\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.039105257528497236,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.039105257528497236\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7580645161290323,\n \"acc_stderr\": 0.024362599693031096,\n \"\
acc_norm\": 0.7580645161290323,\n \"acc_norm_stderr\": 0.024362599693031096\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n \"acc_norm\"\
: 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206865,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206865\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.039439666991836285,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.039439666991836285\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8073394495412844,\n \"acc_stderr\": 0.016909276884936073,\n \"\
acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.016909276884936073\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5787037037037037,\n \"acc_stderr\": 0.033674621388960775,\n \"\
acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.033674621388960775\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.0286265479124374,\n \"acc_norm\"\
: 0.7892156862745098,\n \"acc_norm_stderr\": 0.0286265479124374\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676173,\n \"\
acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.04373313040914761,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.04373313040914761\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911901,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911901\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.013890862162876163,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.013890862162876163\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247333,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247333\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3039106145251397,\n\
\ \"acc_stderr\": 0.015382845587584517,\n \"acc_norm\": 0.3039106145251397,\n\
\ \"acc_norm_stderr\": 0.015382845587584517\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464485,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464485\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n\
\ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n\
\ \"acc_stderr\": 0.012685906538206247,\n \"acc_norm\": 0.4426336375488918,\n\
\ \"acc_norm_stderr\": 0.012685906538206247\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6437908496732027,\n \"acc_stderr\": 0.0193733324207245,\n \
\ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.0193733324207245\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.42228384526614654,\n\
\ \"mc2_stderr\": 0.014152177395393957\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.011793015817663592\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0017827181208053692,\n \
\ \"em_stderr\": 0.00043200973460387867,\n \"f1\": 0.06056837248322149,\n\
\ \"f1_stderr\": 0.0013671084143061485\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.14329037149355572,\n \"acc_stderr\": 0.009650895723357585\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ZoidBB/unraveled-7b-a1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|arc:challenge|25_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|drop|3_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|gsm8k|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hellaswag|10_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|winogrande|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-23T19-22-53.071269.parquet'
- config_name: results
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- results_2023-11-23T19-22-53.071269.parquet
- split: latest
path:
- results_2023-11-23T19-22-53.071269.parquet
---
# Dataset Card for Evaluation run of ZoidBB/unraveled-7b-a1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ZoidBB/unraveled-7b-a1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ZoidBB/unraveled-7b-a1](https://huggingface.co/ZoidBB/unraveled-7b-a1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ZoidBB__unraveled-7b-a1_public",
"harness_winogrande_5",
split="train")
```
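The aggregated metrics live in the "results" configuration described above. As a minimal sketch (assuming the standard layout of these leaderboard detail repositories, where the "latest" split mirrors the newest timestamped run), you could load them like this:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated scores per run; the
# "latest" split always points at the most recent timestamped results.
results = load_dataset(
    "open-llm-leaderboard/details_ZoidBB__unraveled-7b-a1_public",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the newest evaluation run
```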
## Latest results
These are the [latest results from run 2023-11-23T19:22:53.071269](https://huggingface.co/datasets/open-llm-leaderboard/details_ZoidBB__unraveled-7b-a1_public/blob/main/results_2023-11-23T19-22-53.071269.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.627016601295214,
"acc_stderr": 0.03235619334922418,
"acc_norm": 0.6365794887378388,
"acc_norm_stderr": 0.03306927475699416,
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.42228384526614654,
"mc2_stderr": 0.014152177395393957,
"em": 0.0017827181208053692,
"em_stderr": 0.00043200973460387867,
"f1": 0.06056837248322149,
"f1_stderr": 0.0013671084143061485
},
"harness|arc:challenge|25": {
"acc": 0.5648464163822525,
"acc_stderr": 0.014487986197186045,
"acc_norm": 0.5981228668941979,
"acc_norm_stderr": 0.014327268614578274
},
"harness|hellaswag|10": {
"acc": 0.635929097789285,
"acc_stderr": 0.004801852881329739,
"acc_norm": 0.8280223063134834,
"acc_norm_stderr": 0.0037658983649388736
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.039105257528497236,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.039105257528497236
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031096,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031096
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.029318203645206865,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.029318203645206865
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.039439666991836285,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.039439666991836285
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.016909276884936073,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.016909276884936073
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.033674621388960775,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.033674621388960775
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.0286265479124374,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.0286265479124374
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676173,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.04373313040914761,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.04373313040914761
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.03157065078911901,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.03157065078911901
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876163,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876163
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247333,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247333
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3039106145251397,
"acc_stderr": 0.015382845587584517,
"acc_norm": 0.3039106145251397,
"acc_norm_stderr": 0.015382845587584517
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464485,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464485
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4426336375488918,
"acc_stderr": 0.012685906538206247,
"acc_norm": 0.4426336375488918,
"acc_norm_stderr": 0.012685906538206247
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.0193733324207245,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.0193733324207245
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.42228384526614654,
"mc2_stderr": 0.014152177395393957
},
"harness|winogrande|5": {
"acc": 0.7719021310181531,
"acc_stderr": 0.011793015817663592
},
"harness|drop|3": {
"em": 0.0017827181208053692,
"em_stderr": 0.00043200973460387867,
"f1": 0.06056837248322149,
"f1_stderr": 0.0013671084143061485
},
"harness|gsm8k|5": {
"acc": 0.14329037149355572,
"acc_stderr": 0.009650895723357585
}
}
```
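Since successive evals may not cover the same tasks, it can help to enumerate the available configurations before loading anything. A minimal sketch (the names match the `config_name` entries in this card's metadata):
```python
from datasets import get_dataset_config_names

# List every per-task configuration (one per evaluated task, plus "results")
# so you can check which evals are present before loading a "latest" split.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_ZoidBB__unraveled-7b-a1_public"
)
print(len(configs))
print(configs[:5])
```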
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7231050729751587,
-0.8496382832527161,
0.2822355329990387,
0.179800882935524,
-0.2434827834367752,
0.014479683712124825,
0.003922248259186745,
-0.21841013431549072,
0.5675956010818481,
-0.03401647135615349,
-0.5000839829444885,
-0.7397248148918152,
-0.43622973561286926,
0.22679346799850... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Vettor/cabelinho | Vettor | 2023-11-23T19:35:58Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-23T19:35:58Z | 2023-11-23T19:35:09.000Z | 2023-11-23T19:35:09 | ---
license: openrail
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
HK83/real_people_3000 | HK83 | 2023-11-23T19:47:14Z | 0 | 0 | null | [
"license:afl-3.0",
"region:us"
] | 2023-11-23T19:47:14Z | 2023-11-23T19:46:55.000Z | 2023-11-23T19:46:55 | ---
license: afl-3.0
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
MAST0ERLU/NER_TRAIN | MAST0ERLU | 2023-11-23T19:51:45Z | 0 | 0 | null | [
"region:us"
] | 2023-11-23T19:51:45Z | 2023-11-23T19:50:36.000Z | 2023-11-23T19:50:36 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
AppleHarem/kokona_bluearchive | AppleHarem | 2023-11-23T20:03:06Z | 0 | 0 | null | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2023-11-23T20:03:06Z | 2023-11-23T20:02:51.000Z | 2023-11-23T20:02:51 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kokona (Blue Archive)
This is the dataset of kokona (Blue Archive), containing 150 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)) via [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 150 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 416 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 505 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 150 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 150 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 150 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 416 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 416 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 390 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 505 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 505 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
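A minimal sketch of fetching one of the packages above, assuming the zip archives sit at the repository root as the table's relative links suggest:
```python
import zipfile

from huggingface_hub import hf_hub_download

# Download the 384x512 aligned package listed in the table and unpack it.
path = hf_hub_download(
    repo_id="AppleHarem/kokona_bluearchive",
    filename="dataset-384x512.zip",
    repo_type="dataset",
)
with zipfile.ZipFile(path) as zf:
    zf.extractall("kokona_384x512")
```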
| [
-0.7543781995773315,
-0.34440135955810547,
0.5308934450149536,
0.13622184097766876,
-0.3552059531211853,
-0.11026734858751297,
0.1437668800354004,
-0.5461111068725586,
0.7508409023284912,
0.7169153690338135,
-0.752191424369812,
-0.7980547547340393,
-0.5602620840072632,
0.3701719343662262,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
tmbdev/d-tokens | tmbdev | 2023-11-23T20:06:41Z | 0 | 0 | null | [
"region:us"
] | 2023-11-23T20:06:41Z | 2023-11-23T20:06:41.000Z | 2023-11-23T20:06:41 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
luvvz/sa | luvvz | 2023-11-23T20:37:15Z | 0 | 0 | null | [
"region:us"
] | 2023-11-23T20:37:15Z | 2023-11-23T20:37:15.000Z | 2023-11-23T20:37:15 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
joaosanches/subtitles_pt_br_adventure | joaosanches | 2023-11-23T21:00:39Z | 0 | 0 | null | [
"region:us"
] | 2023-11-23T21:00:39Z | 2023-11-23T20:38:21.000Z | 2023-11-23T20:38:21 | ---
dataset_info:
features:
- name: translation
dtype: string
splits:
- name: train
num_bytes: 282
num_examples: 26
download_size: 0
dataset_size: 282
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "subtitles_pt_br_adventure"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.6134083867073059,
-0.2107919305562973,
0.01981610804796219,
0.43149006366729736,
-0.5187259316444397,
0.1951366513967514,
0.08719523251056671,
0.1114698275923729,
0.7594782114028931,
0.4999818801879883,
-1.0702061653137207,
-0.7914994955062866,
-0.6393857598304749,
-0.10571517795324326,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_jb723__cross_lingual_epoch2_public | open-llm-leaderboard | 2023-11-23T20:45:58Z | 0 | 0 | null | [
"region:us"
] | 2023-11-23T20:45:58Z | 2023-11-23T20:45:13.000Z | 2023-11-23T20:45:13 | ---
pretty_name: Evaluation run of jb723/cross_lingual_epoch2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jb723/cross_lingual_epoch2](https://huggingface.co/jb723/cross_lingual_epoch2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jb723__cross_lingual_epoch2_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-23T20:42:48.019981](https://huggingface.co/datasets/open-llm-leaderboard/details_jb723__cross_lingual_epoch2_public/blob/main/results_2023-11-23T20-42-48.019981.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.36365814487208015,\n\
\ \"acc_stderr\": 0.033533768466394574,\n \"acc_norm\": 0.36889909845181407,\n\
\ \"acc_norm_stderr\": 0.03445291913417296,\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.01489627744104185,\n \"mc2\": 0.4789867119861502,\n\
\ \"mc2_stderr\": 0.016540775343672782,\n \"em\": 0.049601510067114093,\n\
\ \"em_stderr\": 0.0022235145171999363,\n \"f1\": 0.07294463087248305,\n\
\ \"f1_stderr\": 0.002421427712218101\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3293515358361775,\n \"acc_stderr\": 0.013734057652635474,\n\
\ \"acc_norm\": 0.3924914675767918,\n \"acc_norm_stderr\": 0.014269634635670714\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3392750448117905,\n\
\ \"acc_stderr\": 0.004724956665879986,\n \"acc_norm\": 0.47918741286596295,\n\
\ \"acc_norm_stderr\": 0.004985456752161\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.04256193767901407,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.04256193767901407\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3026315789473684,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.3026315789473684,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4339622641509434,\n \"acc_stderr\": 0.030503292013342596,\n\
\ \"acc_norm\": 0.4339622641509434,\n \"acc_norm_stderr\": 0.030503292013342596\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3583815028901734,\n\
\ \"acc_stderr\": 0.03656343653353158,\n \"acc_norm\": 0.3583815028901734,\n\
\ \"acc_norm_stderr\": 0.03656343653353158\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.48,\n \"acc_stderr\": 0.05021167315686781,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.05021167315686781\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.030363582197238167,\n\
\ \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.030363582197238167\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843671,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843671\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.38620689655172413,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.38620689655172413,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206824,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206824\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4032258064516129,\n \"acc_stderr\": 0.027906150826041143,\n \"\
acc_norm\": 0.4032258064516129,\n \"acc_norm_stderr\": 0.027906150826041143\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.24630541871921183,\n \"acc_stderr\": 0.03031509928561773,\n \"\
acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.03031509928561773\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3696969696969697,\n \"acc_stderr\": 0.03769430314512568,\n\
\ \"acc_norm\": 0.3696969696969697,\n \"acc_norm_stderr\": 0.03769430314512568\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3787878787878788,\n \"acc_stderr\": 0.03456088731993747,\n \"\
acc_norm\": 0.3787878787878788,\n \"acc_norm_stderr\": 0.03456088731993747\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.49222797927461137,\n \"acc_stderr\": 0.03608003225569653,\n\
\ \"acc_norm\": 0.49222797927461137,\n \"acc_norm_stderr\": 0.03608003225569653\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2923076923076923,\n \"acc_stderr\": 0.023060438380857744,\n\
\ \"acc_norm\": 0.2923076923076923,\n \"acc_norm_stderr\": 0.023060438380857744\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2074074074074074,\n \"acc_stderr\": 0.02472071319395216,\n \
\ \"acc_norm\": 0.2074074074074074,\n \"acc_norm_stderr\": 0.02472071319395216\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36134453781512604,\n \"acc_stderr\": 0.031204691225150023,\n\
\ \"acc_norm\": 0.36134453781512604,\n \"acc_norm_stderr\": 0.031204691225150023\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.0347918557259966,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.0347918557259966\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.43853211009174314,\n \"acc_stderr\": 0.021274713073954572,\n \"\
acc_norm\": 0.43853211009174314,\n \"acc_norm_stderr\": 0.021274713073954572\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18981481481481483,\n \"acc_stderr\": 0.026744714834691926,\n \"\
acc_norm\": 0.18981481481481483,\n \"acc_norm_stderr\": 0.026744714834691926\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.37745098039215685,\n \"acc_stderr\": 0.03402272044340703,\n \"\
acc_norm\": 0.37745098039215685,\n \"acc_norm_stderr\": 0.03402272044340703\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4388185654008439,\n \"acc_stderr\": 0.032302649315470375,\n \
\ \"acc_norm\": 0.4388185654008439,\n \"acc_norm_stderr\": 0.032302649315470375\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.49327354260089684,\n\
\ \"acc_stderr\": 0.03355476596234354,\n \"acc_norm\": 0.49327354260089684,\n\
\ \"acc_norm_stderr\": 0.03355476596234354\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.40458015267175573,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.40458015267175573,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5206611570247934,\n \"acc_stderr\": 0.04560456086387235,\n \"\
acc_norm\": 0.5206611570247934,\n \"acc_norm_stderr\": 0.04560456086387235\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04803752235190193,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04803752235190193\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3558282208588957,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.3558282208588957,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.36893203883495146,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.36893203883495146,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6196581196581197,\n\
\ \"acc_stderr\": 0.031804252043840985,\n \"acc_norm\": 0.6196581196581197,\n\
\ \"acc_norm_stderr\": 0.031804252043840985\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4725415070242657,\n\
\ \"acc_stderr\": 0.017852981266633955,\n \"acc_norm\": 0.4725415070242657,\n\
\ \"acc_norm_stderr\": 0.017852981266633955\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4046242774566474,\n \"acc_stderr\": 0.02642481659400985,\n\
\ \"acc_norm\": 0.4046242774566474,\n \"acc_norm_stderr\": 0.02642481659400985\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.42810457516339867,\n \"acc_stderr\": 0.028332397483664274,\n\
\ \"acc_norm\": 0.42810457516339867,\n \"acc_norm_stderr\": 0.028332397483664274\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4758842443729904,\n\
\ \"acc_stderr\": 0.028365041542564584,\n \"acc_norm\": 0.4758842443729904,\n\
\ \"acc_norm_stderr\": 0.028365041542564584\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4537037037037037,\n \"acc_stderr\": 0.027701228468542602,\n\
\ \"acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.027701228468542602\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340461,\n \
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340461\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2985658409387223,\n\
\ \"acc_stderr\": 0.011688060141794228,\n \"acc_norm\": 0.2985658409387223,\n\
\ \"acc_norm_stderr\": 0.011688060141794228\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.25735294117647056,\n \"acc_stderr\": 0.026556519470041506,\n\
\ \"acc_norm\": 0.25735294117647056,\n \"acc_norm_stderr\": 0.026556519470041506\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.369281045751634,\n \"acc_stderr\": 0.019524316744866356,\n \
\ \"acc_norm\": 0.369281045751634,\n \"acc_norm_stderr\": 0.019524316744866356\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4636363636363636,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.4636363636363636,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4122448979591837,\n \"acc_stderr\": 0.03151236044674281,\n\
\ \"acc_norm\": 0.4122448979591837,\n \"acc_norm_stderr\": 0.03151236044674281\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.472636815920398,\n\
\ \"acc_stderr\": 0.03530235517334682,\n \"acc_norm\": 0.472636815920398,\n\
\ \"acc_norm_stderr\": 0.03530235517334682\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.03789134424611548,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.03789134424611548\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.038057975055904594,\n\
\ \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.038057975055904594\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.01489627744104185,\n \"mc2\": 0.4789867119861502,\n\
\ \"mc2_stderr\": 0.016540775343672782\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6211523283346487,\n \"acc_stderr\": 0.013633724603180335\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.049601510067114093,\n \
\ \"em_stderr\": 0.0022235145171999363,\n \"f1\": 0.07294463087248305,\n\
\ \"f1_stderr\": 0.002421427712218101\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/jb723/cross_lingual_epoch2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|arc:challenge|25_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|drop|3_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|gsm8k|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hellaswag|10_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T20-42-48.019981.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T20-42-48.019981.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- '**/details_harness|winogrande|5_2023-11-23T20-42-48.019981.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-23T20-42-48.019981.parquet'
- config_name: results
data_files:
- split: 2023_11_23T20_42_48.019981
path:
- results_2023-11-23T20-42-48.019981.parquet
- split: latest
path:
- results_2023-11-23T20-42-48.019981.parquet
---
# Dataset Card for Evaluation run of jb723/cross_lingual_epoch2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jb723/cross_lingual_epoch2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jb723/cross_lingual_epoch2](https://huggingface.co/jb723/cross_lingual_epoch2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jb723__cross_lingual_epoch2_public",
"harness_winogrande_5",
split="train")
```
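The aggregated metrics shown below live in the "results" configuration described above. A minimal sketch of loading them, using the "latest" split defined in this card's config list:
```python
from datasets import load_dataset

# "latest" always points at the most recent run, per the split list above.
results = load_dataset(
    "open-llm-leaderboard/details_jb723__cross_lingual_epoch2_public",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics for this run
```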
## Latest results
These are the [latest results from run 2023-11-23T20:42:48.019981](https://huggingface.co/datasets/open-llm-leaderboard/details_jb723__cross_lingual_epoch2_public/blob/main/results_2023-11-23T20-42-48.019981.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.36365814487208015,
"acc_stderr": 0.033533768466394574,
"acc_norm": 0.36889909845181407,
"acc_norm_stderr": 0.03445291913417296,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.01489627744104185,
"mc2": 0.4789867119861502,
"mc2_stderr": 0.016540775343672782,
"em": 0.049601510067114093,
"em_stderr": 0.0022235145171999363,
"f1": 0.07294463087248305,
"f1_stderr": 0.002421427712218101
},
"harness|arc:challenge|25": {
"acc": 0.3293515358361775,
"acc_stderr": 0.013734057652635474,
"acc_norm": 0.3924914675767918,
"acc_norm_stderr": 0.014269634635670714
},
"harness|hellaswag|10": {
"acc": 0.3392750448117905,
"acc_stderr": 0.004724956665879986,
"acc_norm": 0.47918741286596295,
"acc_norm_stderr": 0.004985456752161
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.04256193767901407,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.04256193767901407
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3026315789473684,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.3026315789473684,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4339622641509434,
"acc_stderr": 0.030503292013342596,
"acc_norm": 0.4339622641509434,
"acc_norm_stderr": 0.030503292013342596
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3125,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3583815028901734,
"acc_stderr": 0.03656343653353158,
"acc_norm": 0.3583815028901734,
"acc_norm_stderr": 0.03656343653353158
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686781,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686781
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.030363582197238167,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.030363582197238167
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843671,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843671
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.38620689655172413,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.38620689655172413,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206824,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206824
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4032258064516129,
"acc_stderr": 0.027906150826041143,
"acc_norm": 0.4032258064516129,
"acc_norm_stderr": 0.027906150826041143
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.24630541871921183,
"acc_stderr": 0.03031509928561773,
"acc_norm": 0.24630541871921183,
"acc_norm_stderr": 0.03031509928561773
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3696969696969697,
"acc_stderr": 0.03769430314512568,
"acc_norm": 0.3696969696969697,
"acc_norm_stderr": 0.03769430314512568
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3787878787878788,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.3787878787878788,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.49222797927461137,
"acc_stderr": 0.03608003225569653,
"acc_norm": 0.49222797927461137,
"acc_norm_stderr": 0.03608003225569653
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2923076923076923,
"acc_stderr": 0.023060438380857744,
"acc_norm": 0.2923076923076923,
"acc_norm_stderr": 0.023060438380857744
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.02472071319395216,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.02472071319395216
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36134453781512604,
"acc_stderr": 0.031204691225150023,
"acc_norm": 0.36134453781512604,
"acc_norm_stderr": 0.031204691225150023
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.0347918557259966,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.0347918557259966
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.43853211009174314,
"acc_stderr": 0.021274713073954572,
"acc_norm": 0.43853211009174314,
"acc_norm_stderr": 0.021274713073954572
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18981481481481483,
"acc_stderr": 0.026744714834691926,
"acc_norm": 0.18981481481481483,
"acc_norm_stderr": 0.026744714834691926
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.37745098039215685,
"acc_stderr": 0.03402272044340703,
"acc_norm": 0.37745098039215685,
"acc_norm_stderr": 0.03402272044340703
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4388185654008439,
"acc_stderr": 0.032302649315470375,
"acc_norm": 0.4388185654008439,
"acc_norm_stderr": 0.032302649315470375
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.49327354260089684,
"acc_stderr": 0.03355476596234354,
"acc_norm": 0.49327354260089684,
"acc_norm_stderr": 0.03355476596234354
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.40458015267175573,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.40458015267175573,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5206611570247934,
"acc_stderr": 0.04560456086387235,
"acc_norm": 0.5206611570247934,
"acc_norm_stderr": 0.04560456086387235
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04803752235190193,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04803752235190193
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3558282208588957,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.3558282208588957,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.36893203883495146,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.36893203883495146,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6196581196581197,
"acc_stderr": 0.031804252043840985,
"acc_norm": 0.6196581196581197,
"acc_norm_stderr": 0.031804252043840985
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4725415070242657,
"acc_stderr": 0.017852981266633955,
"acc_norm": 0.4725415070242657,
"acc_norm_stderr": 0.017852981266633955
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4046242774566474,
"acc_stderr": 0.02642481659400985,
"acc_norm": 0.4046242774566474,
"acc_norm_stderr": 0.02642481659400985
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.42810457516339867,
"acc_stderr": 0.028332397483664274,
"acc_norm": 0.42810457516339867,
"acc_norm_stderr": 0.028332397483664274
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4758842443729904,
"acc_stderr": 0.028365041542564584,
"acc_norm": 0.4758842443729904,
"acc_norm_stderr": 0.028365041542564584
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.027701228468542602,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.027701228468542602
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.026684564340461,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.026684564340461
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2985658409387223,
"acc_stderr": 0.011688060141794228,
"acc_norm": 0.2985658409387223,
"acc_norm_stderr": 0.011688060141794228
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.25735294117647056,
"acc_stderr": 0.026556519470041506,
"acc_norm": 0.25735294117647056,
"acc_norm_stderr": 0.026556519470041506
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.369281045751634,
"acc_stderr": 0.019524316744866356,
"acc_norm": 0.369281045751634,
"acc_norm_stderr": 0.019524316744866356
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4636363636363636,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.4636363636363636,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4122448979591837,
"acc_stderr": 0.03151236044674281,
"acc_norm": 0.4122448979591837,
"acc_norm_stderr": 0.03151236044674281
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.472636815920398,
"acc_stderr": 0.03530235517334682,
"acc_norm": 0.472636815920398,
"acc_norm_stderr": 0.03530235517334682
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.03789134424611548,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.03789134424611548
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.038057975055904594,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.038057975055904594
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.01489627744104185,
"mc2": 0.4789867119861502,
"mc2_stderr": 0.016540775343672782
},
"harness|winogrande|5": {
"acc": 0.6211523283346487,
"acc_stderr": 0.013633724603180335
},
"harness|drop|3": {
"em": 0.049601510067114093,
"em_stderr": 0.0022235145171999363,
"f1": 0.07294463087248305,
"f1_stderr": 0.002421427712218101
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7313991785049438,
-0.8749458193778992,
0.25884681940078735,
0.23476167023181915,
-0.16975544393062592,
-0.061196859925985336,
-0.011204537004232407,
-0.2569710910320282,
0.5734401941299438,
-0.029568685218691826,
-0.43989455699920654,
-0.6895619630813599,
-0.42749395966529846,
0.2377474... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
johko/dalle3-eval-samples | johko | 2023-11-23T21:11:27Z | 0 | 0 | null | [
"region:us"
] | 2023-11-23T21:11:27Z | 2023-11-23T20:58:21.000Z | 2023-11-23T20:58:21 | # DALL-E 3 Evaluation Samples
This repository contains text-to-image samples collected for the evaluations of DALL-E 3 in the whitepaper. We provide samples not only from DALL-E 3, but also from the competitors we compare against in the paper.
The intent of this repository is to enable researchers in the text-to-image space to reproduce our results and foster forward progress of the text-to-image field as a whole. The samples from this repository are *not* meant to be demonstrations of the DALL-E 3 system.
## Structure
There are six directories in this repository:
### coco
Contains ~32,000 samples from each model derived from ~8,000 captions from the MSCOCO 2014 evaluation set. These samples are intended to be used for CLIP score calculation.
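As an illustration of how a CLIP score for one (image, caption) pair might be computed, here is a minimal sketch using the `transformers` CLIP implementation; the model checkpoint and file paths are assumptions rather than something this dataset prescribes, and published CLIP scores often rescale the raw cosine similarity:
```python
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def clip_score(image_path: str, caption: str) -> float:
    # Cosine similarity between the CLIP image and text embeddings.
    inputs = processor(text=[caption], images=Image.open(image_path),
                       return_tensors="pt", padding=True)
    with torch.no_grad():
        out = model(**inputs)
    img = out.image_embeds / out.image_embeds.norm(dim=-1, keepdim=True)
    txt = out.text_embeds / out.text_embeds.norm(dim=-1, keepdim=True)
    return float((img * txt).sum())
```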
### drawbench
Contains 4 samples for each prompt from the [drawbench dataset](https://imagen.research.google/) for each model. In the paper, we evaluate these samples using GPT-4 with Vision and human raters.
### drawbench_upsampled
Contains 4 samples for each prompt in our upsampled drawbench dataset, which was derived using the caption upsampling methodology described in the paper. We evaluate these samples using GPT-4 with Vision.
### prompts
Contains the prompts used to generate all of the samples in the other directories. Prompt files are simple text files. The order of the prompts in these files corresponds with the order of the respective image samples.
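A minimal sketch of consuming that ordering convention (the file names below are hypothetical, and the assumption that the 4 samples per prompt sit in consecutive files is inferred from the per-prompt sample counts described on this page, not stated by the repository):
```python
from pathlib import Path

SAMPLES_PER_PROMPT = 4  # per the drawbench description above

# Hypothetical paths; real file names depend on the repository layout.
prompts = Path("prompts/drawbench.txt").read_text().splitlines()
images = sorted(Path("drawbench/dalle3").glob("*.png"))

# Assumption: the samples for one prompt are consecutive in sorted order.
for i, image in enumerate(images):
    print(prompts[i // SAMPLES_PER_PROMPT], "->", image.name)
```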
### t2i_compbench
Contains 4 samples for each prompt in the [T2I CompBench evaluation](https://github.com/Karine-Huang/T2I-CompBench). We use the scripts provided with that evaluation to measure the performance of the models in our comparison.
| [
-0.4791021943092346,
-0.44895216822624207,
0.6316491365432739,
0.3988693356513977,
-0.34376832842826843,
0.20710976421833038,
0.06958713382482529,
-0.6522971391677856,
-0.06679881364107132,
0.5608765482902527,
-0.4904213547706604,
-0.8134812116622925,
-0.4280780255794525,
0.242239475250244... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
felipegallo1/felipegallo1 | felipegallo1 | 2023-11-23T21:54:34Z | 0 | 0 | null | [
"region:us"
] | 2023-11-23T21:54:34Z | 2023-11-23T21:26:24.000Z | 2023-11-23T21:26:24 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Ritori/notebook1 | Ritori | 2023-11-23T21:48:36Z | 0 | 0 | null | [
"region:us"
] | 2023-11-23T21:48:36Z | 2023-11-23T21:46:59.000Z | 2023-11-23T21:46:59 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
yvelos/BBQ | yvelos | 2023-11-23T22:09:44Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-23T22:09:44Z | 2023-11-23T22:09:44.000Z | 2023-11-23T22:09:44 | ---
license: mit
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Buggy23/joaogomes | Buggy23 | 2023-11-23T23:06:36Z | 0 | 0 | null | [
"region:us"
] | 2023-11-23T23:06:36Z | 2023-11-23T22:36:06.000Z | 2023-11-23T22:36:06 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
realjcz/test | realjcz | 2023-11-23T22:41:11Z | 0 | 0 | null | [
"region:us"
] | 2023-11-23T22:41:11Z | 2023-11-23T22:41:11.000Z | 2023-11-23T22:41:11 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
petrpan26/hf-codegen-v2 | petrpan26 | 2023-11-23T22:56:09Z | 0 | 0 | null | [
"region:us"
] | 2023-11-23T22:56:09Z | 2023-11-23T22:56:09.000Z | 2023-11-23T22:56:09 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_uukuguy__SynthIA-7B-v1.3-dare-0.85_public | open-llm-leaderboard | 2023-11-23T23:03:43Z | 0 | 0 | null | [
"region:us"
] | 2023-11-23T23:03:43Z | 2023-11-23T23:02:57.000Z | 2023-11-23T23:02:57 | ---
pretty_name: Evaluation run of uukuguy/SynthIA-7B-v1.3-dare-0.85
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/SynthIA-7B-v1.3-dare-0.85](https://huggingface.co/uukuguy/SynthIA-7B-v1.3-dare-0.85)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__SynthIA-7B-v1.3-dare-0.85_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-23T22:59:57.395887](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__SynthIA-7B-v1.3-dare-0.85_public/blob/main/results_2023-11-23T22-59-57.395887.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6384101997004026,\n\
\ \"acc_stderr\": 0.0320658451939497,\n \"acc_norm\": 0.6475312994622042,\n\
\ \"acc_norm_stderr\": 0.032755008534067175,\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361,\n \"mc2\": 0.4377418572010016,\n\
\ \"mc2_stderr\": 0.014257418960086683,\n \"em\": 0.0018875838926174498,\n\
\ \"em_stderr\": 0.0004445109990558977,\n \"f1\": 0.06350356543624144,\n\
\ \"f1_stderr\": 0.0013999691906909637\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520769,\n\
\ \"acc_norm\": 0.6100682593856656,\n \"acc_norm_stderr\": 0.014252959848892893\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6336387173869747,\n\
\ \"acc_stderr\": 0.004808251269682433,\n \"acc_norm\": 0.8349930292770364,\n\
\ \"acc_norm_stderr\": 0.00370428239078172\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337135,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337135\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383887,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383887\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.0252798503974049,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.0252798503974049\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8146788990825689,\n \"acc_stderr\": 0.01665927970029584,\n \"\
acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.01665927970029584\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.033922384053216174,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.033922384053216174\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406953,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406953\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294407006,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294407006\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.02410571260775431,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.02410571260775431\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n\
\ \"acc_stderr\": 0.015949308790233645,\n \"acc_norm\": 0.34972067039106147,\n\
\ \"acc_norm_stderr\": 0.015949308790233645\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4485006518904824,\n\
\ \"acc_stderr\": 0.012702317490559802,\n \"acc_norm\": 0.4485006518904824,\n\
\ \"acc_norm_stderr\": 0.012702317490559802\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.03878626771002361,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.03878626771002361\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361,\n \"mc2\": 0.4377418572010016,\n\
\ \"mc2_stderr\": 0.014257418960086683\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7892659826361483,\n \"acc_stderr\": 0.011462046419710686\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0018875838926174498,\n \
\ \"em_stderr\": 0.0004445109990558977,\n \"f1\": 0.06350356543624144,\n\
\ \"f1_stderr\": 0.0013999691906909637\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.18574677786201668,\n \"acc_stderr\": 0.010712298902729095\n\
\ }\n}\n```"
repo_url: https://huggingface.co/uukuguy/SynthIA-7B-v1.3-dare-0.85
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|arc:challenge|25_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|drop|3_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|gsm8k|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hellaswag|10_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|winogrande|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-23T22-59-57.395887.parquet'
- config_name: results
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- results_2023-11-23T22-59-57.395887.parquet
- split: latest
path:
- results_2023-11-23T22-59-57.395887.parquet
---
# Dataset Card for Evaluation run of uukuguy/SynthIA-7B-v1.3-dare-0.85
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/SynthIA-7B-v1.3-dare-0.85
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/SynthIA-7B-v1.3-dare-0.85](https://huggingface.co/uukuguy/SynthIA-7B-v1.3-dare-0.85) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__SynthIA-7B-v1.3-dare-0.85_public",
"harness_winogrande_5",
split="train")
```
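The same call pattern works for any configuration listed in this card. As a minimal sketch, the snippet below loads the aggregated `results` configuration at its `latest` split (both names taken from the configs section above) and prints the stored metrics row:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; per the configs listed in this
# card, its "latest" split always points at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_uukuguy__SynthIA-7B-v1.3-dare-0.85_public",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated metrics for the run
```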
## Latest results
These are the [latest results from run 2023-11-23T22:59:57.395887](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__SynthIA-7B-v1.3-dare-0.85_public/blob/main/results_2023-11-23T22-59-57.395887.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6384101997004026,
"acc_stderr": 0.0320658451939497,
"acc_norm": 0.6475312994622042,
"acc_norm_stderr": 0.032755008534067175,
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361,
"mc2": 0.4377418572010016,
"mc2_stderr": 0.014257418960086683,
"em": 0.0018875838926174498,
"em_stderr": 0.0004445109990558977,
"f1": 0.06350356543624144,
"f1_stderr": 0.0013999691906909637
},
"harness|arc:challenge|25": {
"acc": 0.5750853242320819,
"acc_stderr": 0.014445698968520769,
"acc_norm": 0.6100682593856656,
"acc_norm_stderr": 0.014252959848892893
},
"harness|hellaswag|10": {
"acc": 0.6336387173869747,
"acc_stderr": 0.004808251269682433,
"acc_norm": 0.8349930292770364,
"acc_norm_stderr": 0.00370428239078172
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337135,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337135
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383887,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383887
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.0252798503974049,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.0252798503974049
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131147,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131147
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.01665927970029584,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.01665927970029584
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.033922384053216174,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.033922384053216174
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.0306365913486998,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.0306365913486998
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406953,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406953
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407006,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407006
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.02410571260775431,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.02410571260775431
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34972067039106147,
"acc_stderr": 0.015949308790233645,
"acc_norm": 0.34972067039106147,
"acc_norm_stderr": 0.015949308790233645
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4485006518904824,
"acc_stderr": 0.012702317490559802,
"acc_norm": 0.4485006518904824,
"acc_norm_stderr": 0.012702317490559802
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.03878626771002361,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.03878626771002361
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361,
"mc2": 0.4377418572010016,
"mc2_stderr": 0.014257418960086683
},
"harness|winogrande|5": {
"acc": 0.7892659826361483,
"acc_stderr": 0.011462046419710686
},
"harness|drop|3": {
"em": 0.0018875838926174498,
"em_stderr": 0.0004445109990558977,
"f1": 0.06350356543624144,
"f1_stderr": 0.0013999691906909637
},
"harness|gsm8k|5": {
"acc": 0.18574677786201668,
"acc_stderr": 0.010712298902729095
}
}
```
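For programmatic access to this snapshot, one option is to pull the raw results file linked above straight from the dataset repository. The sketch below is only an assumption-laden example: it uses `huggingface_hub.hf_hub_download` with the filename from the link above, and it assumes the metrics shown here may be nested under a `results` key in the raw JSON (not verified in this card):
```python
import json

from huggingface_hub import hf_hub_download

# Download the results JSON linked above directly from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_uukuguy__SynthIA-7B-v1.3-dare-0.85_public",
    filename="results_2023-11-23T22-59-57.395887.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Assumption: the dict shown above may sit under a "results" key in the raw
# file; fall back to the top level otherwise.
metrics = data.get("results", data)
print(metrics["all"]["acc_norm"])  # aggregated normalized accuracy
```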
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.6907168626785278,
-0.8436826467514038,
0.3153219521045685,
0.22387681901454926,
-0.2002798467874527,
-0.0780574306845665,
0.025687048211693764,
-0.2053651511669159,
0.5629164576530457,
-0.02750924415886402,
-0.49065840244293213,
-0.738111138343811,
-0.40265366435050964,
0.22525185346603... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_uukuguy__airoboros-m-7b-3.1.2-dare-0.85_public | open-llm-leaderboard | 2023-11-23T23:07:53Z | 0 | 0 | null | [
"region:us"
] | 2023-11-23T23:07:53Z | 2023-11-23T23:07:08.000Z | 2023-11-23T23:07:08 | ---
pretty_name: Evaluation run of uukuguy/airoboros-m-7b-3.1.2-dare-0.85
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/airoboros-m-7b-3.1.2-dare-0.85](https://huggingface.co/uukuguy/airoboros-m-7b-3.1.2-dare-0.85)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__airoboros-m-7b-3.1.2-dare-0.85_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-23T23:04:08.316762](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__airoboros-m-7b-3.1.2-dare-0.85_public/blob/main/results_2023-11-23T23-04-08.316762.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.634042190068382,\n\
\ \"acc_stderr\": 0.032283010718078695,\n \"acc_norm\": 0.6433235552057146,\n\
\ \"acc_norm_stderr\": 0.03298195134123534,\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.43638896018594414,\n\
\ \"mc2_stderr\": 0.01419131146424957,\n \"em\": 0.0016778523489932886,\n\
\ \"em_stderr\": 0.00041913301788268467,\n \"f1\": 0.06136954697986581,\n\
\ \"f1_stderr\": 0.0013699074965009578\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.575938566552901,\n \"acc_stderr\": 0.0144418896274644,\n\
\ \"acc_norm\": 0.6109215017064846,\n \"acc_norm_stderr\": 0.014247309976045607\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6330412268472416,\n\
\ \"acc_stderr\": 0.00480990115123484,\n \"acc_norm\": 0.8356901015733917,\n\
\ \"acc_norm_stderr\": 0.003697992356124477\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3862433862433862,\n \"acc_stderr\": 0.02507598176760168,\n \"\
acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.02507598176760168\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.0436031486007746,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.0436031486007746\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\"\
: 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8201834862385321,\n \"acc_stderr\": 0.016465345467391545,\n \"\
acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.016465345467391545\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001505,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001505\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n\
\ \"acc_stderr\": 0.01536686038639711,\n \"acc_norm\": 0.3027932960893855,\n\
\ \"acc_norm_stderr\": 0.01536686038639711\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340863,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340863\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4471968709256845,\n\
\ \"acc_stderr\": 0.012698825252435108,\n \"acc_norm\": 0.4471968709256845,\n\
\ \"acc_norm_stderr\": 0.012698825252435108\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406752,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406752\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.02740385941078685,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.02740385941078685\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.036845294917747115,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.036845294917747115\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.43638896018594414,\n\
\ \"mc2_stderr\": 0.01419131146424957\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.01157061486140935\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0016778523489932886,\n \
\ \"em_stderr\": 0.00041913301788268467,\n \"f1\": 0.06136954697986581,\n\
\ \"f1_stderr\": 0.0013699074965009578\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.17437452615617893,\n \"acc_stderr\": 0.010451421361976233\n\
\ }\n}\n```"
repo_url: https://huggingface.co/uukuguy/airoboros-m-7b-3.1.2-dare-0.85
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|arc:challenge|25_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|drop|3_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|gsm8k|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hellaswag|10_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T23-04-08.316762.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T23-04-08.316762.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- '**/details_harness|winogrande|5_2023-11-23T23-04-08.316762.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-23T23-04-08.316762.parquet'
- config_name: results
data_files:
- split: 2023_11_23T23_04_08.316762
path:
- results_2023-11-23T23-04-08.316762.parquet
- split: latest
path:
- results_2023-11-23T23-04-08.316762.parquet
---
# Dataset Card for Evaluation run of uukuguy/airoboros-m-7b-3.1.2-dare-0.85
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/airoboros-m-7b-3.1.2-dare-0.85
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/airoboros-m-7b-3.1.2-dare-0.85](https://huggingface.co/uukuguy/airoboros-m-7b-3.1.2-dare-0.85) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__airoboros-m-7b-3.1.2-dare-0.85_public",
"harness_winogrande_5",
split="train")
```
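The aggregated metrics live in the "results" configuration declared in the YAML header above; a minimal sketch of loading them, using the config and split names listed there:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split always
# points to the newest run
results = load_dataset("open-llm-leaderboard/details_uukuguy__airoboros-m-7b-3.1.2-dare-0.85_public",
	"results",
	split="latest")
```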
## Latest results
These are the [latest results from run 2023-11-23T23:04:08.316762](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__airoboros-m-7b-3.1.2-dare-0.85_public/blob/main/results_2023-11-23T23-04-08.316762.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.634042190068382,
"acc_stderr": 0.032283010718078695,
"acc_norm": 0.6433235552057146,
"acc_norm_stderr": 0.03298195134123534,
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236614,
"mc2": 0.43638896018594414,
"mc2_stderr": 0.01419131146424957,
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788268467,
"f1": 0.06136954697986581,
"f1_stderr": 0.0013699074965009578
},
"harness|arc:challenge|25": {
"acc": 0.575938566552901,
"acc_stderr": 0.0144418896274644,
"acc_norm": 0.6109215017064846,
"acc_norm_stderr": 0.014247309976045607
},
"harness|hellaswag|10": {
"acc": 0.6330412268472416,
"acc_stderr": 0.00480990115123484,
"acc_norm": 0.8356901015733917,
"acc_norm_stderr": 0.003697992356124477
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.02507598176760168,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.02507598176760168
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.0436031486007746,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.0436031486007746
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.016465345467391545,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.016465345467391545
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001505,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001505
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3027932960893855,
"acc_stderr": 0.01536686038639711,
"acc_norm": 0.3027932960893855,
"acc_norm_stderr": 0.01536686038639711
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340863,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340863
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4471968709256845,
"acc_stderr": 0.012698825252435108,
"acc_norm": 0.4471968709256845,
"acc_norm_stderr": 0.012698825252435108
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406752,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406752
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.02740385941078685,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.02740385941078685
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.036845294917747115,
"acc_norm": 0.84,
"acc_norm_stderr": 0.036845294917747115
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236614,
"mc2": 0.43638896018594414,
"mc2_stderr": 0.01419131146424957
},
"harness|winogrande|5": {
"acc": 0.7837411207576953,
"acc_stderr": 0.01157061486140935
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788268467,
"f1": 0.06136954697986581,
"f1_stderr": 0.0013699074965009578
},
"harness|gsm8k|5": {
"acc": 0.17437452615617893,
"acc_stderr": 0.010451421361976233
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.6816958785057068,
-0.8223103880882263,
0.28009262681007385,
0.2170362025499344,
-0.2039659172296524,
-0.08954199403524399,
0.009771019220352173,
-0.1959560215473175,
0.552932858467102,
-0.011062342673540115,
-0.4889267683029175,
-0.7283376455307007,
-0.4319514036178589,
0.19703926146030... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
pabblitu/NFvoiiiice | pabblitu | 2023-11-27T21:49:49Z | 0 | 0 | null | [
"region:us"
] | 2023-11-27T21:49:49Z | 2023-11-23T23:54:38.000Z | 2023-11-23T23:54:38 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
AppleHarem/iroha_bluearchive | AppleHarem | 2023-11-24T00:08:07Z | 0 | 0 | null | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2023-11-24T00:08:07Z | 2023-11-24T00:07:32.000Z | 2023-11-24T00:07:32 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of iroha (Blue Archive)
This is the dataset of iroha (Blue Archive), containing 150 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)) and [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 150 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 416 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 482 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 150 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 150 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 150 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 416 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 416 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 356 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 482 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 482 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
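The relative links above point at zip packages stored at the root of this dataset repository, so a package can also be fetched programmatically. A minimal sketch using `huggingface_hub` (the local extraction path is an arbitrary choice):

```python
from zipfile import ZipFile

from huggingface_hub import hf_hub_download

# download one of the packages listed in the table above
zip_path = hf_hub_download(
    repo_id="AppleHarem/iroha_bluearchive",
    filename="dataset-raw.zip",
    repo_type="dataset",
)

# unpack the images and their tag files locally (illustrative path)
with ZipFile(zip_path) as zf:
    zf.extractall("iroha_raw")
```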
| [
-0.7540107369422913,
-0.35107672214508057,
0.3003793954849243,
0.16482578217983246,
-0.24743950366973877,
-0.19490627944469452,
0.2649960517883301,
-0.6367359757423401,
0.8233993053436279,
0.6609533429145813,
-0.8875887393951416,
-0.6688416004180908,
-0.5479971766471863,
0.3599529266357422... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
migtissera/Hitchhiker | migtissera | 2023-11-27T18:33:51Z | 0 | 1 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-27T18:33:51Z | 2023-11-24T00:09:57.000Z | 2023-11-24T00:09:57 | ---
license: apache-2.0
---
# Hitchhiker's Guide to the Galaxy
GPT-4-Turbo generations designed to elicit responses modelled on The Hitchhiker's Guide to the Galaxy.
Add some spice to your LLMs. Enjoy!

| [
-0.6981176137924194,
-0.40164512395858765,
0.8146414756774902,
0.33355581760406494,
-0.4135493338108063,
0.17803074419498444,
0.2507385313510895,
-0.16210919618606567,
0.3920751214027405,
0.29022055864334106,
-1.4680472612380981,
-0.046476300805807114,
-0.27112695574760437,
0.2692779600620... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
strifedeeno/kaggle_LLM_dataset | strifedeeno | 2023-11-24T00:24:58Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T00:24:58Z | 2023-11-24T00:24:57.000Z | 2023-11-24T00:24:57 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
mindcrafterjesse/Furries | mindcrafterjesse | 2023-11-24T00:34:13Z | 0 | 0 | null | [
"license:unknown",
"region:us"
] | 2023-11-24T00:34:13Z | 2023-11-24T00:34:13.000Z | 2023-11-24T00:34:13 | ---
license: unknown
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
guirondadada/monvoices | guirondadada | 2023-11-24T00:40:27Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T00:40:27Z | 2023-11-24T00:34:30.000Z | 2023-11-24T00:34:30 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
snats/chico_prompts_generate_story | snats | 2023-11-24T00:50:44Z | 0 | 0 | null | [
"language:es",
"license:cc-by-4.0",
"region:us"
] | 2023-11-24T00:50:44Z | 2023-11-24T00:39:13.000Z | 2023-11-24T00:39:13 | ---
license: cc-by-4.0
language:
- es
---
# Dataset Card for Dataset Name
This dataset provides around 8,000 prompts in Spanish for writing short stories from their titles.
The following is the prompt in English:
```
prompt:
Write a short story based on the following title:
{{titles}}
completion:
{{contents}}
```
In Spanish:
```
prompt:
Escribe una historia corta basada en el siguiente título {{titles}}
completion:
{{contents}}
```
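A minimal loading sketch; note that the split name and the `prompt`/`completion` column names below are assumptions inferred from the template labels above, not documented fields:

```python
from datasets import load_dataset

# split and column names are assumptions based on the template labels
ds = load_dataset("snats/chico_prompts_generate_story", split="train")
example = ds[0]
print(example["prompt"])      # "Escribe una historia corta basada en el siguiente título ..."
print(example["completion"])  # the story body
```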
# More Information
This dataset is a sub-version of the original [chico dataset](https://huggingface.co/datasets/snats/chico). | [
-0.41923171281814575,
-0.48054948449134827,
0.3479496240615845,
0.45881932973861694,
-0.5533426403999329,
0.44309306144714355,
-0.11162199079990387,
-0.3989333212375641,
0.7818713784217834,
0.5460984706878662,
-1.0348823070526123,
-0.6921914219856262,
-0.45854389667510986,
0.60924077033996... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
AppleHarem/midori_bluearchive | AppleHarem | 2023-11-24T00:42:42Z | 0 | 0 | null | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2023-11-24T00:42:42Z | 2023-11-24T00:42:09.000Z | 2023-11-24T00:42:09 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of midori (Blue Archive)
This is the dataset of midori (Blue Archive), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)) and [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 556 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 676 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 556 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 556 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 518 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 676 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 676 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
| [
-0.6582289338111877,
-0.25132808089256287,
0.4271332323551178,
0.21938897669315338,
-0.23640449345111847,
-0.1374235600233078,
0.138790100812912,
-0.4832979738712311,
0.684255063533783,
0.5493781566619873,
-0.9223673343658447,
-0.6820986866950989,
-0.6063721179962158,
0.4679458439350128,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
snats/chico_prompts_suggest_title | snats | 2023-11-24T00:49:22Z | 0 | 0 | null | [
"language:es",
"license:cc-by-4.0",
"region:us"
] | 2023-11-24T00:49:22Z | 2023-11-24T00:46:15.000Z | 2023-11-24T00:46:15 | ---
license: cc-by-4.0
language:
- es
---
# Dataset Card for chico_prompts_suggest_title
This dataset provides around 8,000 prompts in Spanish to suggest titles from snippets of short stories.
The following is the prompt in English:
```
prompt:
Suggest a title for the following story:
{{contents}}
completion:
Sure, here's a suitable title for the given story {{titles}}.
```
In Spanish:
```
prompt:
Sugiere un título para la siguiente historia: {{contents}}
completion:
Un título posible para la siguiente historia podría ser: {{titles}}
```
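For illustration, a tiny sketch of how the Spanish template above could be instantiated for a new story snippet (the helper function is hypothetical, not part of the dataset):

```python
# Hypothetical helper that fills the Spanish title-suggestion template above.
def build_prompt(contents: str) -> str:
    return f"Sugiere un título para la siguiente historia: {contents}"

print(build_prompt("Había una vez un pueblo junto al mar..."))
```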
# More Information
This dataset is a sub-version of the original [chico dataset](https://huggingface.co/datasets/snats/chico). | [
-0.3956160247325897,
-0.45731303095817566,
0.41553792357444763,
0.47612208127975464,
-0.6755273342132568,
0.3028864562511444,
-0.190086230635643,
-0.16817061603069305,
0.8015180826187134,
0.6260256171226501,
-0.9278314709663391,
-0.610328733921051,
-0.5834217667579651,
0.5432795882225037,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
alpayariyak/wildchat-code-gpt4 | alpayariyak | 2023-11-24T02:50:41Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T02:50:41Z | 2023-11-24T01:19:37.000Z | 2023-11-24T01:19:37 | ---
dataset_info:
features:
- name: conversation_id
dtype: string
- name: model
dtype: string
- name: timestamp
dtype: timestamp[s, tz=UTC]
- name: conversation
list:
- name: content
dtype: string
- name: language
dtype: string
- name: redacted
dtype: bool
- name: role
dtype: string
- name: toxic
dtype: bool
- name: turn
dtype: int64
- name: language
dtype: string
- name: openai_moderation
list:
- name: categories
struct:
- name: harassment
dtype: bool
- name: harassment/threatening
dtype: bool
- name: hate
dtype: bool
- name: hate/threatening
dtype: bool
- name: self-harm
dtype: bool
- name: self-harm/instructions
dtype: bool
- name: self-harm/intent
dtype: bool
- name: sexual
dtype: bool
- name: sexual/minors
dtype: bool
- name: violence
dtype: bool
- name: violence/graphic
dtype: bool
- name: category_scores
struct:
- name: harassment
dtype: float64
- name: harassment/threatening
dtype: float64
- name: hate
dtype: float64
- name: hate/threatening
dtype: float64
- name: self-harm
dtype: float64
- name: self-harm/instructions
dtype: float64
- name: self-harm/intent
dtype: float64
- name: sexual
dtype: float64
- name: sexual/minors
dtype: float64
- name: violence
dtype: float64
- name: violence/graphic
dtype: float64
- name: flagged
dtype: bool
- name: detoxify_moderation
list:
- name: identity_attack
dtype: float32
- name: insult
dtype: float32
- name: obscene
dtype: float32
- name: severe_toxicity
dtype: float32
- name: sexual_explicit
dtype: float32
- name: threat
dtype: float32
- name: toxicity
dtype: float32
- name: toxic
dtype: bool
- name: redacted
dtype: bool
- name: input_1
dtype: string
- name: output_1
dtype: string
splits:
- name: train
num_bytes: 68778365.78381307
num_examples: 8230
download_size: 49740612
dataset_size: 68778365.78381307
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wildchat-code-gpt4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.4062308073043823,
-0.3468243479728699,
0.13536690175533295,
0.4537937343120575,
-0.4110478162765503,
0.1254044771194458,
0.20088429749011993,
-0.16839566826820374,
0.807844340801239,
0.39416491985321045,
-0.9995335936546326,
-0.8026184439659119,
-0.4499479830265045,
-0.04160848259925842... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
js282979/kepler | js282979 | 2023-11-24T01:39:16Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T01:39:16Z | 2023-11-24T01:38:22.000Z | 2023-11-24T01:38:22 | Entry not found | [
-0.32276472449302673,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965679168701,
0.7915717363357544,
0.07618629932403564,
0.7746022939682007,
0.2563222646713257,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
PahaII/vllm_safety_evaluation | PahaII | 2023-11-28T02:35:11Z | 0 | 1 | null | [
"license:apache-2.0",
"arxiv:2311.16101",
"region:us"
] | 2023-11-28T02:35:11Z | 2023-11-24T01:57:59.000Z | 2023-11-24T01:57:59 | ---
license: apache-2.0
---
# How Many Unicorns Are In This Image? A Safety Evaluation Benchmark For Vision LLMs (Dataset)
Paper: https://arxiv.org/abs/2311.16101
Code: https://github.com/UCSC-VLAA/vllm-safety-benchmark
The full dataset should look like this:
```
.
├── ./safety_evaluation_benchmark_datasets/
├── gpt4v_challenging_set # Contains the challenging test data for GPT4V
├── attack_images
├── sketchy_images
├── oodcv_images
├── misleading-attack.json
├── sketchy-vqa-challenging.json
└── oodcv-vqa-counterfactual.json
├── redteaming-mislead # Contains the test data for redteaming tasks
├── redteaming_attack
├── gaussian_noise
├── mixattack_eps32
├── mixattack_eps64
├── sinattack_eps64_dog
├── sinattack_eps64_coconut
├── sinattack_eps64_spaceship
└── annotation.json
└── jailbreak_llm # adversarial suffixes for jailbreaking VLLM through LLM
└── ood # Contains the test data for OOD scenarios
├── sketchy-vqa
├── sketchy-vqa.json
├── sketchy-challenging.json
└── oodcv-vqa
├── oodcv-vqa.json
└── oodcv-counterfactual.json
``` | [
-0.27893897891044617,
-0.45042160153388977,
0.30688124895095825,
0.32628828287124634,
-0.5100463032722473,
0.0080459825694561,
0.5544990301132202,
-0.3349224030971527,
0.01853131130337715,
0.6462295651435852,
-0.5401781797409058,
-0.8317402601242065,
-0.5248475074768066,
0.2073291838169098... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
inswave/AISquare_Koalpaca_Orca_merged | inswave | 2023-11-24T02:08:22Z | 0 | 0 | null | [
"task_categories:text-generation",
"language:ko",
"license:cc-by-nc-4.0",
"region:us"
] | 2023-11-24T02:08:22Z | 2023-11-24T02:02:34.000Z | 2023-11-24T02:02:34 | ---
license: cc-by-nc-4.0
task_categories:
- text-generation
language:
- ko
--- | [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
promptboom/pserver-offline-0.0.0-windows | promptboom | 2023-11-24T03:31:03Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T03:31:03Z | 2023-11-24T02:49:24.000Z | 2023-11-24T02:49:24 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Zhongyuan/xl-gen | Zhongyuan | 2023-11-24T03:31:37Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-24T03:31:37Z | 2023-11-24T02:54:04.000Z | 2023-11-24T02:54:04 | ---
license: openrail
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
HamdanXI/arb-eng-parallel-10k-splitted-cosine-5 | HamdanXI | 2023-11-24T02:54:18Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T02:54:18Z | 2023-11-24T02:54:16.000Z | 2023-11-24T02:54:16 | ---
dataset_info:
features:
- name: arabic
dtype: string
- name: english
dtype: string
splits:
- name: train
num_bytes: 3013957
num_examples: 6489
- name: validation
num_bytes: 407437
num_examples: 1000
- name: test
num_bytes: 419389
num_examples: 1000
download_size: 2165455
dataset_size: 3840783
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
yentinglin/Taiwan-MT-Bench | yentinglin | 2023-11-24T03:27:10Z | 0 | 0 | null | [
"task_categories:table-question-answering",
"task_categories:question-answering",
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:zh",
"license:apache-2.0",
"region:us"
] | 2023-11-24T03:27:10Z | 2023-11-24T03:23:40.000Z | 2023-11-24T03:23:40 | ---
license: apache-2.0
task_categories:
- table-question-answering
- question-answering
- text-generation
language:
- zh
size_categories:
- 1K<n<10K
pretty_name: TWMTBench
data_files:
- split: test
path: Taiwan-MT-Bench.jsonl
--- | [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
wicked-ai/LLM | wicked-ai | 2023-11-24T03:43:56Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-24T03:43:56Z | 2023-11-24T03:31:03.000Z | 2023-11-24T03:31:03 | ---
license: apache-2.0
---
| [
-0.128533735871315,
-0.18616747856140137,
0.6529128551483154,
0.4943627715110779,
-0.19319336116313934,
0.2360745221376419,
0.3607197701931,
0.05056330934166908,
0.5793653130531311,
0.740013837814331,
-0.6508103013038635,
-0.23783954977989197,
-0.7102248668670654,
-0.04782583937048912,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
openerotica/erotiquant-xl | openerotica | 2023-11-24T03:50:06Z | 0 | 3 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-24T03:50:06Z | 2023-11-24T03:46:10.000Z | 2023-11-24T03:46:10 | ---
license: apache-2.0
---
I added the longer context samples from erotica-analysys and removed the smaller samples from the original erotiquant. The minimum context size is 8000 per sample. | [
-0.5550714135169983,
-0.5582371950149536,
0.2778047025203705,
0.4386369287967682,
-0.623884379863739,
-1.0985169410705566,
0.120455801486969,
-0.5495879650115967,
0.7450664043426514,
0.7721987962722778,
-0.3850525915622711,
-0.27122095227241516,
-0.2087816298007965,
0.3850715458393097,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
ErhaChen/minecraft | ErhaChen | 2023-11-24T03:57:20Z | 0 | 0 | null | [
"task_categories:text-to-image",
"license:apache-2.0",
"minecraft",
"style",
"region:us"
] | 2023-11-24T03:57:20Z | 2023-11-24T03:53:26.000Z | 2023-11-24T03:53:26 | ---
license: apache-2.0
task_categories:
- text-to-image
tags:
- minecraft
- style
--- | [
-0.128533735871315,
-0.18616747856140137,
0.6529128551483154,
0.4943627715110779,
-0.19319336116313934,
0.2360745221376419,
0.3607197701931,
0.05056330934166908,
0.5793653130531311,
0.740013837814331,
-0.6508103013038635,
-0.23783954977989197,
-0.7102248668670654,
-0.04782583937048912,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
HamdanXI/arb-eng-parallel-10k-splitted-euclidean-100 | HamdanXI | 2023-11-24T04:01:59Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T04:01:59Z | 2023-11-24T04:01:55.000Z | 2023-11-24T04:01:55 | ---
dataset_info:
features:
- name: arabic
dtype: string
- name: english
dtype: string
splits:
- name: train
num_bytes: 3054333
num_examples: 6636
- name: validation
num_bytes: 407437
num_examples: 1000
- name: test
num_bytes: 419389
num_examples: 1000
download_size: 2193837
dataset_size: 3881159
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
| [
-0.128533735871315,
-0.18616747856140137,
0.6529128551483154,
0.4943627715110779,
-0.19319336116313934,
0.2360745221376419,
0.3607197701931,
0.05056330934166908,
0.5793653130531311,
0.740013837814331,
-0.6508103013038635,
-0.23783954977989197,
-0.7102248668670654,
-0.04782583937048912,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
nguyenthanhdo/webglm_vi | nguyenthanhdo | 2023-11-24T04:04:16Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T04:04:16Z | 2023-11-24T04:04:11.000Z | 2023-11-24T04:04:11 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: references
sequence: string
- name: translated
dtype: bool
splits:
- name: train
num_bytes: 154260166
num_examples: 43579
download_size: 77675307
dataset_size: 154260166
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "webglm_vi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.7430773973464966,
-0.3869355320930481,
0.1220831647515297,
0.14001883566379547,
-0.15754124522209167,
0.024794679135084152,
0.14110393822193146,
-0.15545104444026947,
0.5130682587623596,
0.3798552453517914,
-0.8346767425537109,
-1.0150854587554932,
-0.5222765207290649,
-0.32482397556304... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Yvan-W/WSABuild | Yvan-W | 2023-11-24T04:39:11Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-24T04:39:11Z | 2023-11-24T04:39:11.000Z | 2023-11-24T04:39:11 | ---
license: openrail
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
enniA/guindita_data | enniA | 2023-11-24T04:47:37Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T04:47:37Z | 2023-11-24T04:47:37.000Z | 2023-11-24T04:47:37 | Entry not found | [
-0.3227647542953491,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965083122253,
0.7915717959403992,
0.07618629932403564,
0.7746022343635559,
0.2563222348690033,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
lauransotomayor/eco_composition | lauransotomayor | 2023-11-24T05:07:40Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-24T05:07:40Z | 2023-11-24T04:56:04.000Z | 2023-11-24T04:56:04 | ---
license: mit
---
Data sample for testing DL code
| [
-0.34294408559799194,
-0.7041491866111755,
0.15603604912757874,
0.008206188678741455,
-0.06455398350954056,
-0.2673529386520386,
0.41735950112342834,
-0.11966381967067719,
-0.04027526080608368,
0.630962610244751,
-0.9241701364517212,
-0.7780186533927917,
0.24647220969200134,
0.038876838982... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Beerth/marceloxz | Beerth | 2023-11-24T05:00:59Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-24T05:00:59Z | 2023-11-24T05:00:33.000Z | 2023-11-24T05:00:33 | ---
license: openrail
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
CS5647Team3/data_mini | CS5647Team3 | 2023-11-24T05:16:24Z | 0 | 0 | null | [
"task_categories:token-classification",
"size_categories:100M<n<1B",
"language:zh",
"tone",
"pinyin",
"sentence",
"audio",
"region:us"
] | 2023-11-24T05:16:24Z | 2023-11-24T05:04:37.000Z | 2023-11-24T05:04:37 | ---
task_categories:
- token-classification
language:
- zh
tags:
- tone
- pinyin
- sentence
- audio
size_categories:
- 100M<n<1B
---
## Dataset Details
- Welcome to the Single-Speaker Mandarin Audio Dataset! This dataset is a curated subset extracted from a larger collection, focusing on audio recordings of a single speaker. Each audio file is accompanied by valuable linguistic annotations, including Pinyin transcriptions, tone information, and onset and offset details.
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Speaker:** The dataset exclusively features recordings of a single Mandarin speaker, providing consistency for various linguistic analyses and applications.
- **Pinyin Transcriptions:** Each audio file comes with a corresponding Pinyin transcription, offering a phonetic representation of the spoken Mandarin.
- **Tone Information:** Tone annotations are included to capture the tonal characteristics of the spoken language. This feature is essential for tone-related studies and applications.
- **Onset and Offset Details:** Precise information about the onset and offset of each audio segment is provided. This allows for accurate segmentation and analysis of the spoken content.
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- Subset of the original Kaggle dataset
## Uses
- Use for model evaluation or demo
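A minimal inspection sketch for evaluation or demo use (the split and column names are assumptions based on the description above; check `ds.column_names` first):

```python
# Minimal sketch -- split/column names are assumptions, not guaranteed by this card.
from datasets import load_dataset

ds = load_dataset("CS5647Team3/data_mini", split="train")
print(ds.column_names)  # expect audio plus pinyin / tone / onset / offset annotations
print(ds[0])
```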
| [
-0.21504493057727814,
-0.39725127816200256,
-0.1508447527885437,
0.42922767996788025,
-0.2725014388561249,
-0.09507620334625244,
-0.4121725261211395,
0.06637819856405258,
0.3996436893939972,
1.001373291015625,
-0.7627237439155579,
-0.7457211017608643,
-0.1330738663673401,
-0.03946234658360... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
mesolitica/chatgpt4-ayat-aktif-pasif | mesolitica | 2023-11-28T22:53:59Z | 0 | 0 | null | [
"region:us"
] | 2023-11-28T22:53:59Z | 2023-11-24T05:21:05.000Z | 2023-11-24T05:21:05 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
TheGreatP/MinhaVoz3 | TheGreatP | 2023-11-24T05:24:40Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-24T05:24:40Z | 2023-11-24T05:24:16.000Z | 2023-11-24T05:24:16 | ---
license: openrail
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
CS5647Team3/full_dataset | CS5647Team3 | 2023-11-24T05:28:48Z | 0 | 0 | null | [
"task_categories:text-classification",
"language:zh",
"tone",
"pinyin",
"region:us"
] | 2023-11-24T05:28:48Z | 2023-11-24T05:26:28.000Z | 2023-11-24T05:26:28 | ---
task_categories:
- text-classification
language:
- zh
tags:
- tone
- pinyin
---
The full dataset is available on Kaggle:
Paddle Speech -> AISHELL-3 -> Train
https://www.kaggle.com/datasets/zenbot99/paddle-speech/ | [
-0.16762012243270874,
-0.5016583204269409,
0.2277408242225647,
0.18814226984977722,
-0.3590928316116333,
0.038783349096775055,
0.03692896291613579,
-0.3672022819519043,
0.46040090918540955,
0.7444222569465637,
-0.9832606911659241,
-0.658041775226593,
-0.630088210105896,
-0.4865451753139496... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
babaka/chicken_dataset | babaka | 2023-11-24T05:28:53Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-24T05:28:53Z | 2023-11-24T05:28:53.000Z | 2023-11-24T05:28:53 | ---
license: mit
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Nimitzzz/LLamadatademo | Nimitzzz | 2023-11-24T06:07:48Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T06:07:48Z | 2023-11-24T06:02:33.000Z | 2023-11-24T06:02:33 | ---
# For reference on dataset card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/datasetcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/datasets-cards
{}
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | [
-0.5322356224060059,
-0.5534716844558716,
0.1290130317211151,
0.23470577597618103,
-0.39626216888427734,
-0.11762470006942749,
-0.03545305132865906,
-0.6389272212982178,
0.5699822306632996,
0.7838326692581177,
-0.7834625840187073,
-0.9173274040222168,
-0.55633145570755,
0.13078093528747559... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
AppleHarem/mari_bluearchive | AppleHarem | 2023-11-24T06:27:23Z | 0 | 0 | null | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2023-11-24T06:27:23Z | 2023-11-24T06:27:00.000Z | 2023-11-24T06:27:00 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mari (Blue Archive)
This is the dataset of mari (Blue Archive), containing 150 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)) using [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 150 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 409 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 475 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 150 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 150 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 150 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 409 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 409 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 324 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 475 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 475 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
| [
-0.6492072939872742,
-0.2851469814777374,
0.39837175607681274,
0.2396152764558792,
-0.1790284365415573,
-0.20167651772499084,
0.16408361494541168,
-0.49020063877105713,
0.7302085161209106,
0.6351835131645203,
-0.8959088325500488,
-0.7784760594367981,
-0.566567063331604,
0.43110811710357666... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
TheGreatP/LeoMorachiolli | TheGreatP | 2023-11-24T06:32:50Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-24T06:32:50Z | 2023-11-24T06:32:20.000Z | 2023-11-24T06:32:20 | ---
license: openrail
---
| [
-0.1285339742898941,
-0.18616800010204315,
0.6529127359390259,
0.4943626821041107,
-0.1931934952735901,
0.2360742688179016,
0.360720157623291,
0.05056300014257431,
0.5793654322624207,
0.7400140166282654,
-0.6508105993270874,
-0.23783984780311584,
-0.7102248668670654,
-0.047826044261455536,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
agibot-zy/zy_dataset_jaka | agibot-zy | 2023-11-24T07:03:17Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-24T07:03:17Z | 2023-11-24T07:03:17.000Z | 2023-11-24T07:03:17 | ---
license: mit
---
| [
-0.1285339742898941,
-0.18616800010204315,
0.6529127359390259,
0.4943626821041107,
-0.1931934952735901,
0.2360742688179016,
0.360720157623291,
0.05056300014257431,
0.5793654322624207,
0.7400140166282654,
-0.6508105993270874,
-0.23783984780311584,
-0.7102248668670654,
-0.047826044261455536,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
nguyenthanhdo/ultrachat-aem-alpaca-v2.0 | nguyenthanhdo | 2023-11-24T07:49:15Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T07:49:15Z | 2023-11-24T07:03:31.000Z | 2023-11-24T07:03:31 | ```py
from datasets import load_dataset
ultra = load_dataset(
"stingning/ultrachat",
data_files=[
"train_*.jsonl"
],
split="train"
)
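# Flatten each multi-turn conversation into Alpaca-style fields (instruction / input / output).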
def get_first_turn(example):
data = example["data"]
instruction, output = data[0], data[1]
example.pop("data")
example["instruction"] = instruction
example["input"] = ''
example["output"] = output
return example
## Assistance on Existing Materials
aem_signatures = [
# "Based on the text material above, generate the response to the following quesion or instruction",
"based on the text material",
"given the text",
"here is a piece of text",
# "Generate response to the question/instruction based on a piece of given material",
"based on a piece of given material"
"based on the passage above",
"answer according to",
"generate according to",
# Read the passage below and answer the question or follow the instruction
"read the passage below",
# "paraphrase",
# "rewrit",
# "article",
# "context",
# "passage",
# "summa"
]
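# Filter: keep conversations whose first instruction contains one of the AEM signatures.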
def aem(example):
data = example["data"]
first_instruction = data[0]
flag = False
if any([kw in first_instruction.lower() for kw in aem_signatures]):
flag = True
return flag
ultra_aem = ultra.filter(aem)
ultra_aem_first_turn = ultra_aem.map(get_first_turn)
ultra_aem.push_to_hub("nguyenthanhdo/ultrachat-aem-v2.0")
ultra_aem_first_turn.push_to_hub("nguyenthanhdo/ultrachat-aem-alpaca-v2.0")
``` | [
-0.17738430202007294,
-0.7095452547073364,
0.18546460568904877,
0.0724346786737442,
-0.2528722286224365,
-0.037963759154081345,
-0.14457745850086212,
-0.052817147225141525,
0.18430976569652557,
0.8421397805213928,
-0.6505528688430786,
-0.6546158790588379,
-0.33874091506004333,
0.2652065157... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Badrish/coding_dataset | Badrish | 2023-11-24T07:12:11Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-24T07:12:11Z | 2023-11-24T07:12:11.000Z | 2023-11-24T07:12:11 | ---
license: apache-2.0
---
| [
-0.1285339742898941,
-0.18616800010204315,
0.6529127359390259,
0.4943626821041107,
-0.1931934952735901,
0.2360742688179016,
0.360720157623291,
0.05056300014257431,
0.5793654322624207,
0.7400140166282654,
-0.6508105993270874,
-0.23783984780311584,
-0.7102248668670654,
-0.047826044261455536,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
youngwoo3283/df_sentiment_1 | youngwoo3283 | 2023-11-24T07:13:14Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T07:13:14Z | 2023-11-24T07:13:13.000Z | 2023-11-24T07:13:13 | Entry not found | [
-0.3227648138999939,
-0.22568459808826447,
0.8622260093688965,
0.43461498618125916,
-0.5282989144325256,
0.701296329498291,
0.7915719151496887,
0.07618649303913116,
0.7746025323867798,
0.2563220262527466,
-0.7852813601493835,
-0.22573833167552948,
-0.9104480743408203,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
jonghyunlee/UniProt_functional_text_descriptions | jonghyunlee | 2023-11-24T07:24:31Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T07:24:31Z | 2023-11-24T07:22:41.000Z | 2023-11-24T07:22:41 | Entry not found | [
-0.3227648138999939,
-0.22568459808826447,
0.8622260093688965,
0.43461498618125916,
-0.5282989144325256,
0.701296329498291,
0.7915719151496887,
0.07618649303913116,
0.7746025323867798,
0.2563220262527466,
-0.7852813601493835,
-0.22573833167552948,
-0.9104480743408203,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
atdexo/your-dataset-name | atdexo | 2023-11-24T07:23:01Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T07:23:01Z | 2023-11-24T07:23:01.000Z | 2023-11-24T07:23:01 | Entry not found | [
-0.3227648138999939,
-0.22568459808826447,
0.8622260093688965,
0.43461498618125916,
-0.5282989144325256,
0.701296329498291,
0.7915719151496887,
0.07618649303913116,
0.7746025323867798,
0.2563220262527466,
-0.7852813601493835,
-0.22573833167552948,
-0.9104480743408203,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
mtkinit/mtkinit_small_sentiment_dataset | mtkinit | 2023-11-24T07:28:15Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T07:28:15Z | 2023-11-24T07:28:14.000Z | 2023-11-24T07:28:14 | ---
pretty_name: mtkinit/small_sentiment_dataset
---
# mtkinit/small_sentiment_dataset
Created from the AIOD platform. | [
-0.6032553315162659,
-0.15998584032058716,
0.007496681530028582,
0.3491162061691284,
-0.40441349148750305,
0.1346043199300766,
0.16208098828792572,
0.13515719771385193,
0.8520936369895935,
0.5329465270042419,
-0.48046261072158813,
-0.5821521282196045,
-0.565609335899353,
-0.248154133558273... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
nguyenthanhdo/dolphin-cqa-v1.0 | nguyenthanhdo | 2023-11-24T07:35:08Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T07:35:08Z | 2023-11-24T07:34:54.000Z | 2023-11-24T07:34:54 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: lang
dtype: string
splits:
- name: en
num_bytes: 187712666
num_examples: 79999
- name: vi
num_bytes: 248492204
num_examples: 79999
download_size: 233755470
dataset_size: 436204870
configs:
- config_name: default
data_files:
- split: en
path: data/en-*
- split: vi
path: data/vi-*
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
skywalkeryjp/clip | skywalkeryjp | 2023-11-24T07:43:31Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T07:43:31Z | 2023-11-24T07:43:31.000Z | 2023-11-24T07:43:31 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
vishalsmb/guanaco-llama2-1k | vishalsmb | 2023-11-24T07:48:37Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T07:48:37Z | 2023-11-24T07:48:32.000Z | 2023-11-24T07:48:32 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966692
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
AnonymousUserCLIP4MC/CLIP4MC | AnonymousUserCLIP4MC | 2023-11-24T07:48:35Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T07:48:35Z | 2023-11-24T07:48:35.000Z | 2023-11-24T07:48:35 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
fptu/vlrw | fptu | 2023-11-24T08:50:35Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T08:50:35Z | 2023-11-24T07:55:33.000Z | 2023-11-24T07:55:33 | Entry not found | [
-0.3227648138999939,
-0.22568459808826447,
0.8622260093688965,
0.43461498618125916,
-0.5282989144325256,
0.701296329498291,
0.7915719151496887,
0.07618649303913116,
0.7746025323867798,
0.2563220262527466,
-0.7852813601493835,
-0.22573833167552948,
-0.9104480743408203,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
HamdanXI/arb-eng-parallel-10k-splitted-euclidean-80 | HamdanXI | 2023-11-24T08:01:29Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T08:01:29Z | 2023-11-24T08:01:27.000Z | 2023-11-24T08:01:27 | ---
dataset_info:
features:
- name: arabic
dtype: string
- name: english
dtype: string
splits:
- name: train
num_bytes: 339531
num_examples: 666
- name: validation
num_bytes: 407437
num_examples: 1000
- name: test
num_bytes: 419389
num_examples: 1000
download_size: 664054
dataset_size: 1166357
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
| [
-0.12853379547595978,
-0.18616773188114166,
0.6529127955436707,
0.4943625330924988,
-0.19319316744804382,
0.23607458174228668,
0.36071985960006714,
0.05056329071521759,
0.5793651938438416,
0.740013837814331,
-0.6508100628852844,
-0.23783975839614868,
-0.710224986076355,
-0.0478257611393928... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
mtkinit/mtkinit_tcb_small_sentiment_dataset | mtkinit | 2023-11-24T08:09:57Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T08:09:57Z | 2023-11-24T08:09:56.000Z | 2023-11-24T08:09:56 | ---
pretty_name: mtkinit/tcb_small_sentiment_dataset
---
# mtkinit/tcb_small_sentiment_dataset
Created from the AIOD platform. | [
-0.5391944050788879,
-0.29561468958854675,
0.022963937371969223,
0.4196018874645233,
-0.4723025858402252,
0.38424867391586304,
0.18130943179130554,
0.1341007947921753,
0.8127323985099792,
0.509091854095459,
-0.4526199996471405,
-0.5710681676864624,
-0.5243983864784241,
-0.31599679589271545... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Chu0113/hekk | Chu0113 | 2023-11-24T08:32:23Z | 0 | 0 | null | [
"license:cc",
"region:us"
] | 2023-11-24T08:32:23Z | 2023-11-24T08:13:28.000Z | 2023-11-24T08:13:28 | ---
license: cc
dataset_info:
- config_name: all
features:
- name: data_index_by_user
dtype: int32
- name: article
dtype: string
- name: answer
dtype: string
- name: question
dtype: string
- name: options
sequence: string
splits:
- name: train
num_bytes: 191129599
num_examples: 87866
- name: validation
num_bytes: 10507580
num_examples: 4887
- name: test
num_bytes: 10668488
num_examples: 4934
download_size: 46954865
dataset_size: 212305667
- config_name: middle
features:
- name: data_index_by_user
dtype: int32
- name: article
dtype: string
- name: answer
dtype: string
- name: question
dtype: string
- name: options
sequence: string
splits:
- name: train
num_bytes: 191129599
num_examples: 87866
- name: validation
num_bytes: 10507580
num_examples: 4887
- name: test
num_bytes: 10668488
num_examples: 4934
download_size: 46954865
dataset_size: 212305667
configs:
- config_name: all
data_files:
- split: train
path: all/train-*
- split: validation
path: all/validation-*
- split: test
path: all/test-*
- config_name: middle
data_files:
- split: train
path: middle/train-*
- split: validation
path: middle/validation-*
- split: test
path: middle/test-*
---
| [
-0.12853379547595978,
-0.18616773188114166,
0.6529127955436707,
0.4943625330924988,
-0.19319316744804382,
0.23607458174228668,
0.36071985960006714,
0.05056329071521759,
0.5793651938438416,
0.740013837814331,
-0.6508100628852844,
-0.23783975839614868,
-0.710224986076355,
-0.0478257611393928... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Singhoo/gsm8k_dev | Singhoo | 2023-11-24T08:17:59Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T08:17:59Z | 2023-11-24T08:17:14.000Z | 2023-11-24T08:17:14 | 1,500 samples from the train split. | [
-0.8377229571342468,
-0.12603288888931274,
0.0452447272837162,
0.6520413160324097,
-0.13774241507053375,
0.11181318759918213,
0.38735756278038025,
-0.20726613700389862,
0.826856791973114,
0.7678831815719604,
-0.6560665965080261,
0.23639528453350067,
-0.36823830008506775,
0.0671614408493042... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
figenfikri/stsb_tr | figenfikri | 2023-11-24T08:28:16Z | 0 | 0 | null | [
"task_categories:text-classification",
"task_ids:text-scoring",
"task_ids:semantic-similarity-scoring",
"annotations_creators:crowdsourced",
"language_creators:machine-generated",
"multilinguality:monolingual",
"size_categories:1K<n<10K",
"source_datasets:extended|other-sts-b",
"language:tr",
"reg... | 2023-11-24T08:28:16Z | 2023-11-24T08:26:52.000Z | 2023-11-24T08:26:52 | ---
annotations_creators:
- crowdsourced
language_creators:
- machine-generated
language:
- tr
multilinguality:
- monolingual
pretty_name: Semantic Textual Similarity in Turkish
size_categories:
- 1K<n<10K
source_datasets:
- extended|other-sts-b
task_categories:
- text-classification
task_ids:
- text-scoring
- semantic-similarity-scoring
---
## Dataset Description
Repository: https://github.com/verimsu/STSb-TR
### Dataset Summary
The STSb-TR dataset is the machine-translated version of the English STS benchmark dataset, produced with the Google Cloud Translation API.
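A minimal loading sketch (assuming the repo ships data files loadable by `datasets`; the splits and columns should mirror the English STS-B, but verify against the repo):

```python
# Minimal sketch -- assumes STS-B-style splits and columns; verify against the repo files.
from datasets import load_dataset

ds = load_dataset("figenfikri/stsb_tr", split="train")
print(ds[0])  # expect a Turkish sentence pair with a 0-5 similarity score
```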
### Citation
```
@inproceedings{beken-fikri-etal-2021-semantic,
title = "Semantic Similarity Based Evaluation for Abstractive News Summarization",
author = "Beken Fikri, Figen and Oflazer, Kemal and Yanikoglu, Berrin",
booktitle = "Proceedings of the 1st Workshop on Natural Language Generation, Evaluation, and Metrics (GEM 2021)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.gem-1.3",
doi = "10.18653/v1/2021.gem-1.3",
pages = "24--33",
abstract = "ROUGE is a widely used evaluation metric in text summarization. However, it is not suitable for the evaluation of abstractive summarization systems as it relies on lexical overlap between the gold standard and the generated summaries. This limitation becomes more apparent for agglutinative languages with very large vocabularies and high type/token ratios. In this paper, we present semantic similarity models for Turkish and apply them as evaluation metrics for an abstractive summarization task. To achieve this, we translated the English STSb dataset into Turkish and presented the first semantic textual similarity dataset for Turkish as well. We showed that our best similarity models have better alignment with average human judgments compared to ROUGE in both Pearson and Spearman correlations.",
}
``` | [
-0.20711897313594818,
-0.5890638828277588,
0.33579376339912415,
0.06132384389638901,
-0.6705938577651978,
0.23696011304855347,
-0.29537737369537354,
-0.4639454483985901,
0.42617490887641907,
0.45107510685920715,
-0.39744648337364197,
-0.9380436539649963,
-0.7444368600845337,
0.239536628127... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
HamdanXI/arb-eng-parallel-10k-splitted-euclidean-95 | HamdanXI | 2023-11-24T08:41:25Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T08:41:25Z | 2023-11-24T08:41:23.000Z | 2023-11-24T08:41:23 | ---
dataset_info:
features:
- name: arabic
dtype: string
- name: english
dtype: string
splits:
- name: train
num_bytes: 2506374
num_examples: 5227
- name: validation
num_bytes: 407437
num_examples: 1000
- name: test
num_bytes: 419389
num_examples: 1000
download_size: 1889682
dataset_size: 3333200
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
91stefan/class_man_women | 91stefan | 2023-11-25T11:30:57Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T11:30:57Z | 2023-11-24T09:41:12.000Z | 2023-11-24T09:41:12 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
mtkinit/mtkinit_TCB_sentiment_dataset | mtkinit | 2023-11-24T09:44:25Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T09:44:25Z | 2023-11-24T09:44:24.000Z | 2023-11-24T09:44:24 | ---
pretty_name: mtkinit/TCB-sentiment-dataset
---
# mtkinit/TCB-sentiment-dataset
Created from the AIOD platform. | [
-0.5027194023132324,
-0.23691558837890625,
-0.023457705974578857,
0.4944406747817993,
-0.4419974684715271,
0.5056309700012207,
0.21807484328746796,
0.032450709491968155,
0.7808594703674316,
0.6407104134559631,
-0.506276547908783,
-0.6920080780982971,
-0.5587315559387207,
-0.358479708433151... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
BarrenWardo/SDControlNets | BarrenWardo | 2023-11-24T09:55:14Z | 0 | 0 | null | [
"license:unknown",
"region:us"
] | 2023-11-24T09:55:14Z | 2023-11-24T09:55:13.000Z | 2023-11-24T09:55:13 | ---
license: unknown
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Medilora/mimic_iii_diagnosis_anonymous | Medilora | 2023-11-24T23:08:04Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-24T23:08:04Z | 2023-11-24T09:56:12.000Z | 2023-11-24T09:56:12 | ---
license: mit
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
trymtv/norwegian-parliament-speeches | trymtv | 2023-11-24T10:14:58Z | 0 | 0 | null | [
"task_categories:text-classification",
"size_categories:100K<n<1M",
"language:no",
"license:cc0-1.0",
"region:us"
] | 2023-11-24T10:14:58Z | 2023-11-24T10:06:57.000Z | 2023-11-24T10:06:57 | ---
license: cc0-1.0
task_categories:
- text-classification
language:
- 'no'
pretty_name: Norwegian parliament speeches
size_categories:
- 100K<n<1M
---
# Dataset Card for Norwegian parliament speeches
## Dataset Details
### Dataset Description
Speeches from the Norwegian parliament between 1998 and 2022, parsed from ParlaMint-NO, the Norwegian part of the EU ParlaMint corpus.
### Dataset Sources
Source: https://www.nb.no/sprakbanken/en/resource-catalogue/oai-nb-no-sbr-77/ | [
-0.2745814919471741,
-0.4707110822200775,
-0.02095162495970726,
0.02577732503414154,
-0.630771815776825,
-0.23204943537712097,
0.002021871507167816,
-0.024851344525814056,
0.42593950033187866,
0.9388942718505859,
-0.52122563123703,
-0.3653091788291931,
-0.3533751666545868,
0.14102570712566... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Pablao0948/Wynq | Pablao0948 | 2023-11-24T10:26:40Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-24T10:26:40Z | 2023-11-24T10:19:48.000Z | 2023-11-24T10:19:48 | ---
license: openrail
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
patnelt60/train_data_tok1 | patnelt60 | 2023-11-24T10:22:31Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T10:22:31Z | 2023-11-24T10:22:31.000Z | 2023-11-24T10:22:31 | Entry not found | [
-0.3227647542953491,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965083122253,
0.7915717959403992,
0.07618629932403564,
0.7746022343635559,
0.2563222348690033,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
ivne20/reveal | ivne20 | 2023-11-24T10:39:39Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T10:39:39Z | 2023-11-24T10:39:36.000Z | 2023-11-24T10:39:36 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype:
class_label:
names:
'0': 'False'
'1': 'True'
splits:
- name: train
num_bytes: 26661950
num_examples: 18187
- name: test
num_bytes: 6648520
num_examples: 4547
download_size: 11878633
dataset_size: 33310470
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
bilallllllll/maow | bilallllllll | 2023-11-24T10:42:14Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T10:42:14Z | 2023-11-24T10:42:09.000Z | 2023-11-24T10:42:09 | ---
dataset_info:
features:
- name: id
dtype: int32
- name: previous_frame
dtype: image
- name: next_frame
dtype: image
- name: next_frame_pose
dtype: image
splits:
- name: train
num_bytes: 2492565.0
num_examples: 12
download_size: 2473017
dataset_size: 2492565.0
---
# Dataset Card for "maow"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.5177006721496582,
-0.10184880346059799,
0.18891872465610504,
0.4996451735496521,
-0.23204763233661652,
-0.22081111371517181,
0.28316396474838257,
-0.22708509862422943,
0.668353796005249,
0.5720097422599792,
-0.857915997505188,
-0.9409410357475281,
-0.5333800911903381,
-0.226156175136566... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
shirshakach/Llama-Dataset | shirshakach | 2023-11-27T01:45:16Z | 0 | 0 | null | [
"region:us"
] | 2023-11-27T01:45:16Z | 2023-11-24T10:45:03.000Z | 2023-11-24T10:45:03 | ---
dataset_info:
features:
- name: query_id
dtype: int32
- name: answers
sequence: string
- name: passages
struct:
- name: is_selected
sequence: int32
- name: passage_text
sequence: string
- name: url
sequence: string
- name: query
dtype: string
- name: query_type
dtype: string
- name: wellFormedAnswers
sequence: 'null'
- name: ai_answers
dtype: string
- name: query_len
dtype: int64
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 22204146
num_examples: 5000
download_size: 10885468
dataset_size: 22204146
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
OpenNLPLab/FAVDBench | OpenNLPLab | 2023-11-26T07:50:06Z | 0 | 1 | null | [
"size_categories:10K<n<100K",
"language:en",
"language:zh",
"license:apache-2.0",
"FAVD",
"FAVDBench",
"Video Description",
"Audio Description",
"Audible Video Description",
"Fine-grained Description",
"arxiv:2303.15616",
"region:us"
] | 2023-11-26T07:50:06Z | 2023-11-24T10:53:16.000Z | 2023-11-24T10:53:16 | ---
license: apache-2.0
language:
- en
- zh
tags:
- FAVD
- FAVDBench
- Video Description
- Audio Description
- Audible Video Description
- Fine-grained Description
size_categories:
- 10K<n<100K
---
<div align="center">
<h1>
FAVDBench: Fine-grained Audible Video Description
</h1>
</div>
<p align="center">
🤗 <a href="https://huggingface.co/datasets/OpenNLPLab/FAVDBench" target="_blank">Hugging Face</a> •
🏠 <a href="https://github.com/OpenNLPLab/FAVDBench" target="_blank">GitHub</a> •
🤖 <a href="https://openxlab.org.cn/datasets/OpenNLPLab/FAVDBench" target="_blank">OpenDataLab</a> •
💬 <a href="https://forms.gle/5S3DWpBaV1UVczkf8" target="_blank">Apply Dataset</a>
</p>
[[`CVPR2023`]](https://openaccess.thecvf.com/content/CVPR2023/html/Shen_Fine-Grained_Audible_Video_Description_CVPR_2023_paper.html) [[`Project Page`]](http://www.avlbench.opennlplab.cn/papers/favd) [[`arXiv`]](https://arxiv.org/abs/2303.15616) [[`Demo`]](https://www.youtube.com/watch?v=iWJvTB-bTWk&ab_channel=OpenNLPLab) [[`BibTex`]](#Citation) [[`中文简介`]](https://mp.weixin.qq.com/s/_M57ZuOHH0UdwB6i9osqOA)
- [Introduction 简介](#introduction-简介)
- [Files 文件](#files-文件)
- [MD5 checksum](#md5-checksum)
- [Updates](#updates)
- [License](#license)
- [Citation](#citation)
## Introduction 简介
在CVPR2023中我们提出了精细化音视频描述任务(Fine-grained Audible Video Description, FAVD)。该任务旨在提供有关可听视频的详细文本描述,包括每个对象的外观和空间位置、移动对象的动作以及视频中的声音。我们同时也为社区贡献了第一个精细化音视频描述数据集FAVDBench。对于每个视频片段,我们不仅提供一句话的视频概要,还提供4-6句描述视频的视觉细节和1-2个音频相关描述,且所有的标注都有中英文双语。
At CVPR2023, we introduced the task of Fine-grained Audible Video Description (FAVD). This task aims to provide detailed textual descriptions of audible videos, including the appearance and spatial positions of each object, the actions of moving objects, and the sounds within the video. Additionally, we contributed the first fine-grained audible video description dataset, FAVDBench, to the community. For each video segment, we offer not only a single-sentence video summary but also 4-6 sentences describing the visual details of the video and 1-2 audio-related descriptions, all annotated in both Chinese and English.
## Files 文件
* `meta`: metadata for the raw videos
  * `train`, `val`, `test`: train/val/test splits
    * `ytid`: YouTube id
    * `start`: segment start time in seconds
    * `end`: segment end time in seconds
* `videos`, `audios`: raw video and audio segments
  * `train`: train split
  * `val`: validation split
  * `test`: test split
  * **📢📢📢 Please refer to [Apply Dataset](https://forms.gle/5S3DWpBaV1UVczkf8) to get the raw video/audio data**
* `annotations_en.json`: annotated descriptions in English (see the loading sketch after this list)
  * `id`: unique data (video segment) id
  * `description`: audio-visual descriptions
* `annotations_zh.json`: annotated descriptions in Chinese
  * `id`: unique data (video segment) id
  * `cap`, `des`: audio-visual descriptions
  * `dcount`: number of descriptions
* `experiments`: experimental files to replicate the results reported in the paper
  * **📢📢📢 Please refer to the [GitHub Repo](https://github.com/OpenNLPLab/FAVDBench) to get the related data**
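If the released annotation files follow the schema listed above, loading them is straightforward. A minimal sketch (the top-level list shape and the field names are assumptions taken from this description; adjust if the released files differ):

```python
import json

# Assumption: annotations_en.json is a JSON list of objects with
# "id" and "description" fields, as the file list above suggests.
with open("annotations_en.json", encoding="utf-8") as f:
    annotations = json.load(f)

for item in annotations[:3]:
    print(item["id"], "->", item["description"])
```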
## MD5 checksum
| file | md5sum |
| :-------------------------: | :------------------------------: |
| `videos/train.zip` | 41ddad46ffac339cb0b65dffc02eda65 |
| `videos/val.zip` | 35291ad23944d67212c6e47b4cc6d619 |
| `videos/test.zip` | 07046d205837d2e3b1f65549fc1bc4d7 |
| `audios/train.zip` | 50cc83eebd84f85e9b86bbd2a7517f3f |
| `audios/val.zip` | 73995c5d1fcef269cc90be8a8ef6d917 |
| `audios/test.zip` | f72085feab6ca36060a0a073b31e8acc |
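To check a downloaded archive against the table above, a streaming MD5 works well in Python. A minimal sketch (adjust the path to wherever you saved the zip):

```python
import hashlib

def md5sum(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in chunks so large zips don't need to fit in memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

assert md5sum("videos/train.zip") == "41ddad46ffac339cb0b65dffc02eda65"
```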
## Updates
**Latest Version: Jan 9, 2022. Public v0.1**
1. v0.1 (Jan 9, 2022): initial publication
## License
Community use of the FAVDBench model & code requires adherence to [Apache 2.0](https://github.com/OpenNLPLab/FAVDBench/blob/main/LICENSE). The FAVDBench model & code support commercial use.
## Citation
If you use FAVD or FAVDBench in your research, please use the following BibTeX entry.
```
@InProceedings{Shen_2023_CVPR,
author = {Shen, Xuyang and Li, Dong and Zhou, Jinxing and Qin, Zhen and He, Bowen and Han, Xiaodong and Li, Aixuan and Dai, Yuchao and Kong, Lingpeng and Wang, Meng and Qiao, Yu and Zhong, Yiran},
title = {Fine-Grained Audible Video Description},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2023},
pages = {10585-10596}
}
``` | [
-0.7402588725090027,
-0.9524984359741211,
0.13205565512180328,
0.28503182530403137,
-0.2493981271982193,
-0.5330920815467834,
-0.3065662384033203,
-0.16932106018066406,
0.22227589786052704,
0.2544538378715515,
-0.7296581268310547,
-0.5078955292701721,
-0.44425299763679504,
-0.3303733468055... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
bilallllllll/CNetImg2Img | bilallllllll | 2023-11-24T11:00:08Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T11:00:08Z | 2023-11-24T11:00:02.000Z | 2023-11-24T11:00:02 | ---
dataset_info:
features:
- name: id
dtype: int32
- name: input_image
dtype: image
- name: edit_pose
dtype: image
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 2992719.0
num_examples: 15
download_size: 2976836
dataset_size: 2992719.0
---
# Dataset Card for "CNetImg2Img-samples"
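A minimal sketch of walking the (input, pose, edit) image triplets, assuming the split loads as the schema above declares:

```python
from datasets import load_dataset

ds = load_dataset("bilallllllll/CNetImg2Img", split="train")

for row in ds:
    # Each image column decodes to a PIL.Image.
    triplet = (row["input_image"], row["edit_pose"], row["edited_image"])
    print(row["id"], [im.size for im in triplet])
```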
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.6651491522789001,
-0.2733691930770874,
0.12682384252548218,
0.22275078296661377,
-0.27468934655189514,
-0.09808079153299332,
0.06390213966369629,
-0.18729469180107117,
0.7345869541168213,
0.5164593458175659,
-1.0618109703063965,
-0.9161279797554016,
-0.6721494197845459,
-0.4010634720325... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
mussso/kuroshiba_raizo | mussso | 2023-11-24T11:26:14Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T11:26:14Z | 2023-11-24T11:20:20.000Z | 2023-11-24T11:20:20 | kuroshiba raizo dataset
Dataset of Raizo, a black Shiba Inu. | [
-0.5005120038986206,
-0.3688671290874481,
0.3450925052165985,
0.33495819568634033,
-0.7290155291557312,
0.055095124989748,
-0.010121778585016727,
-0.09057313203811646,
0.709140419960022,
1.0177268981933594,
-0.8897587060928345,
-1.0859506130218506,
-0.8104191422462463,
0.17753742635250092,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
yeboyswag/yourmumjokes | yeboyswag | 2023-11-24T11:37:17Z | 0 | 0 | null | [
"size_categories:n<1K",
"language:en",
"joke",
"input",
"region:us"
] | 2023-11-24T11:37:17Z | 2023-11-24T11:25:01.000Z | 2023-11-24T11:25:01 | ---
language:
- en
size_categories:
- n<1K
tags:
- joke
- input
--- | [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Kinoxplus/CB01 | Kinoxplus | 2023-11-24T12:03:01Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T12:03:01Z | 2023-11-24T11:33:43.000Z | 2023-11-24T11:33:43 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
pihalf/gss-models | pihalf | 2023-11-24T11:42:42Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T11:42:42Z | 2023-11-24T11:42:42.000Z | 2023-11-24T11:42:42 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
FASOXO/COUI | FASOXO | 2023-11-24T11:43:07Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-24T11:43:07Z | 2023-11-24T11:43:07.000Z | 2023-11-24T11:43:07 | ---
license: openrail
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
AppleHarem/unicorn_azurlane | AppleHarem | 2023-11-24T11:50:12Z | 0 | 0 | null | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2023-11-24T11:50:12Z | 2023-11-24T11:49:47.000Z | 2023-11-24T11:49:47 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of unicorn (Azur Lane)
This is the dataset of unicorn (Azur Lane), containing 200 images and their tags.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs); see also [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 522 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 597 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 522 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 522 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 323 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 597 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 597 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
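Each variant above is a single zip in the repo, so `huggingface_hub` can fetch one directly. A minimal sketch (assuming the zips sit at the repo root, as the relative links suggest):

```python
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="AppleHarem/unicorn_azurlane",
    filename="dataset-384x512.zip",
    repo_type="dataset",
)
print(path)  # local cache path of the downloaded archive
```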
| [
-0.6477268934249878,
-0.4043126702308655,
0.2619076371192932,
0.3229046165943146,
-0.026107335463166237,
-0.1560661494731903,
0.39056339859962463,
-0.5647011399269104,
0.6950771808624268,
0.7732248902320862,
-0.9202058911323547,
-0.7934823632240295,
-0.42330071330070496,
0.3847290873527527... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
presencesw/esnli | presencesw | 2023-11-24T12:05:49Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T12:05:49Z | 2023-11-24T11:57:33.000Z | 2023-11-24T11:57:33 | ---
dataset_info:
features:
- name: question
dtype: string
- name: choice
dtype: string
- name: answer
dtype: string
- name: cot
dtype: string
splits:
- name: train
num_bytes: 11163841
num_examples: 36174
download_size: 4229630
dataset_size: 11163841
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
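A minimal loading sketch for the question/choice/answer/chain-of-thought columns, assuming the split reads back as declared:

```python
from datasets import load_dataset

ds = load_dataset("presencesw/esnli", split="train")

row = ds[0]
print(row["question"])
print(row["choice"])
print(row["answer"], "--", row["cot"][:120])
```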
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
bkpfabiano/mashaurso | bkpfabiano | 2023-11-24T12:57:32Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T12:57:32Z | 2023-11-24T12:47:56.000Z | 2023-11-24T12:47:56 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
sahil2801/test_ga | sahil2801 | 2023-11-27T12:46:49Z | 0 | 0 | null | [
"region:us"
] | 2023-11-27T12:46:49Z | 2023-11-24T12:50:36.000Z | 2023-11-24T12:50:36 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null |