| datasetId | card |
|---|---|
spiceai/drive_stats | ---
license: mit
---
|
open-llm-leaderboard/details_uukuguy__Mistral-7B-OpenOrca-lora | ---
pretty_name: Evaluation run of uukuguy/Mistral-7B-OpenOrca-lora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/Mistral-7B-OpenOrca-lora](https://huggingface.co/uukuguy/Mistral-7B-OpenOrca-lora)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__Mistral-7B-OpenOrca-lora_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T15:44:18.785582](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__Mistral-7B-OpenOrca-lora_public/blob/main/results_2023-11-13T15-44-18.785582.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6351832920969729,\n\
\ \"acc_stderr\": 0.03210898212657927,\n \"acc_norm\": 0.6445450507876114,\n\
\ \"acc_norm_stderr\": 0.03280393070910138,\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.015785370858396725,\n \"mc2\": 0.4274271734982197,\n\
\ \"mc2_stderr\": 0.014247308828610854,\n \"em\": 0.0019924496644295304,\n\
\ \"em_stderr\": 0.00045666764626669387,\n \"f1\": 0.06191694630872485,\n\
\ \"f1_stderr\": 0.0013823026381279647\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5742320819112628,\n \"acc_stderr\": 0.014449464278868807,\n\
\ \"acc_norm\": 0.6194539249146758,\n \"acc_norm_stderr\": 0.014188277712349814\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6357299342760406,\n\
\ \"acc_stderr\": 0.004802413919932666,\n \"acc_norm\": 0.8361880103565027,\n\
\ \"acc_norm_stderr\": 0.003693484894179416\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\"\
: 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n\
\ \"acc_stderr\": 0.024362599693031096,\n \"acc_norm\": 0.7580645161290323,\n\
\ \"acc_norm_stderr\": 0.024362599693031096\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\"\
: 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616258,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616258\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612927,\n \"\
acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612927\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.033922384053216174,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.033922384053216174\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159263,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159263\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911901,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911901\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294407003,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294407003\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3318435754189944,\n\
\ \"acc_stderr\": 0.015748421208187306,\n \"acc_norm\": 0.3318435754189944,\n\
\ \"acc_norm_stderr\": 0.015748421208187306\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537375,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537375\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45241199478487615,\n\
\ \"acc_stderr\": 0.012712265105889133,\n \"acc_norm\": 0.45241199478487615,\n\
\ \"acc_norm_stderr\": 0.012712265105889133\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093085,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093085\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.015785370858396725,\n \"mc2\": 0.4274271734982197,\n\
\ \"mc2_stderr\": 0.014247308828610854\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7908445146014207,\n \"acc_stderr\": 0.011430450045881575\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0019924496644295304,\n \
\ \"em_stderr\": 0.00045666764626669387,\n \"f1\": 0.06191694630872485,\n\
\ \"f1_stderr\": 0.0013823026381279647\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.1728582259287339,\n \"acc_stderr\": 0.010415432246200585\n\
\ }\n}\n```"
repo_url: https://huggingface.co/uukuguy/Mistral-7B-OpenOrca-lora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|drop|3_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-44-18.785582.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-44-18.785582.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- '**/details_harness|winogrande|5_2023-11-13T15-44-18.785582.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T15-44-18.785582.parquet'
- config_name: results
data_files:
- split: 2023_11_13T15_44_18.785582
path:
- results_2023-11-13T15-44-18.785582.parquet
- split: latest
path:
- results_2023-11-13T15-44-18.785582.parquet
---
# Dataset Card for Evaluation run of uukuguy/Mistral-7B-OpenOrca-lora
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/Mistral-7B-OpenOrca-lora
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/Mistral-7B-OpenOrca-lora](https://huggingface.co/uukuguy/Mistral-7B-OpenOrca-lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
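The split-naming convention can be sketched mechanically. The rule below is an assumption inferred from the split names listed in this card's YAML header (it is not documented by the leaderboard itself): a run's split name is its timestamp with `-` and `:` replaced by `_`.

```python
# Assumption inferred from this card's YAML header: a run's split name is
# the run timestamp with '-' and ':' replaced by '_' (the '.' is kept).
timestamp = "2023-11-13T15:44:18.785582"
split_name = timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_11_13T15_44_18.785582
```

This matches the `split: 2023_11_13T15_44_18.785582` entries that appear throughout the `data_files` section above.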
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__Mistral-7B-OpenOrca-lora_public",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-11-13T15:44:18.785582](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__Mistral-7B-OpenOrca-lora_public/blob/main/results_2023-11-13T15-44-18.785582.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in its task-specific results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6351832920969729,
"acc_stderr": 0.03210898212657927,
"acc_norm": 0.6445450507876114,
"acc_norm_stderr": 0.03280393070910138,
"mc1": 0.2839657282741738,
"mc1_stderr": 0.015785370858396725,
"mc2": 0.4274271734982197,
"mc2_stderr": 0.014247308828610854,
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626669387,
"f1": 0.06191694630872485,
"f1_stderr": 0.0013823026381279647
},
"harness|arc:challenge|25": {
"acc": 0.5742320819112628,
"acc_stderr": 0.014449464278868807,
"acc_norm": 0.6194539249146758,
"acc_norm_stderr": 0.014188277712349814
},
"harness|hellaswag|10": {
"acc": 0.6357299342760406,
"acc_stderr": 0.004802413919932666,
"acc_norm": 0.8361880103565027,
"acc_norm_stderr": 0.003693484894179416
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031096,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031096
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616258,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616258
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612927,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612927
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.033922384053216174,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.033922384053216174
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.03157065078911901,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.03157065078911901
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407003,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407003
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3318435754189944,
"acc_stderr": 0.015748421208187306,
"acc_norm": 0.3318435754189944,
"acc_norm_stderr": 0.015748421208187306
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.024748624490537375,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.024748624490537375
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45241199478487615,
"acc_stderr": 0.012712265105889133,
"acc_norm": 0.45241199478487615,
"acc_norm_stderr": 0.012712265105889133
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093085,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093085
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2839657282741738,
"mc1_stderr": 0.015785370858396725,
"mc2": 0.4274271734982197,
"mc2_stderr": 0.014247308828610854
},
"harness|winogrande|5": {
"acc": 0.7908445146014207,
"acc_stderr": 0.011430450045881575
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626669387,
"f1": 0.06191694630872485,
"f1_stderr": 0.0013823026381279647
},
"harness|gsm8k|5": {
"acc": 0.1728582259287339,
"acc_stderr": 0.010415432246200585
}
}
```
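These per-task entries can be post-processed directly. The sketch below uses a few values copied verbatim from the JSON above: it averages some `hendrycksTest` (MMLU) subtask accuracies, and, under the assumption that the reported `acc_stderr` follows the binomial standard error `sqrt(p * (1 - p) / n)`, roughly recovers the Winogrande sample size from the reported numbers.

```python
# A few entries copied verbatim from the results above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6444444444444445},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6644736842105263},
}

# Average accuracy over the selected hendrycksTest (MMLU) subtasks.
accs = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
mean_acc = sum(accs) / len(accs)
print(mean_acc)

# Assumption: acc_stderr behaves like the binomial standard error
# sqrt(p * (1 - p) / n), so n can be roughly recovered from it.
acc = 0.7908445146014207           # harness|winogrande|5 "acc"
acc_stderr = 0.011430450045881575  # harness|winogrande|5 "acc_stderr"
n = acc * (1 - acc) / acc_stderr ** 2
print(round(n))  # roughly 1266, close to the Winogrande validation set size
```

The recovered `n` landing near 1267 (the size of the Winogrande validation split) is a useful sanity check that the standard errors are internally consistent with the accuracies.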
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
liesvarastranta/arxiv_cs_category | ---
license: cc-by-4.0
---
This is the same dataset as the one available on Kaggle (https://www.kaggle.com/datasets/Cornell-University/arxiv).
Only the CS category has been selected. |
open-llm-leaderboard/details_gemmathon__gemma-pro-3.1b-ko-v0.1 | ---
pretty_name: Evaluation run of gemmathon/gemma-pro-3.1b-ko-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gemmathon/gemma-pro-3.1b-ko-v0.1](https://huggingface.co/gemmathon/gemma-pro-3.1b-ko-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gemmathon__gemma-pro-3.1b-ko-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-08T13:29:55.724034](https://huggingface.co/datasets/open-llm-leaderboard/details_gemmathon__gemma-pro-3.1b-ko-v0.1/blob/main/results_2024-04-08T13-29-55.724034.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.39457199436905527,\n\
\ \"acc_stderr\": 0.034199970929033895,\n \"acc_norm\": 0.3985370625965584,\n\
\ \"acc_norm_stderr\": 0.03498053366386257,\n \"mc1\": 0.2215422276621787,\n\
\ \"mc1_stderr\": 0.014537867601301137,\n \"mc2\": 0.3491235555428734,\n\
\ \"mc2_stderr\": 0.01355886433946854\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4496587030716723,\n \"acc_stderr\": 0.014537144444284732,\n\
\ \"acc_norm\": 0.4709897610921502,\n \"acc_norm_stderr\": 0.014586776355294321\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5220075682135032,\n\
\ \"acc_stderr\": 0.004984945635998312,\n \"acc_norm\": 0.7042421828321052,\n\
\ \"acc_norm_stderr\": 0.00455449940929072\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3618421052631579,\n \"acc_stderr\": 0.03910525752849724,\n\
\ \"acc_norm\": 0.3618421052631579,\n \"acc_norm_stderr\": 0.03910525752849724\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.44528301886792454,\n \"acc_stderr\": 0.030588052974270655,\n\
\ \"acc_norm\": 0.44528301886792454,\n \"acc_norm_stderr\": 0.030588052974270655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4027777777777778,\n\
\ \"acc_stderr\": 0.04101405519842425,\n \"acc_norm\": 0.4027777777777778,\n\
\ \"acc_norm_stderr\": 0.04101405519842425\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.37572254335260113,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.37572254335260113,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149351,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149351\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.031778212502369216,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.031778212502369216\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4068965517241379,\n \"acc_stderr\": 0.04093793981266237,\n\
\ \"acc_norm\": 0.4068965517241379,\n \"acc_norm_stderr\": 0.04093793981266237\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.02306818884826112,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02306818884826112\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4483870967741935,\n \"acc_stderr\": 0.02829205683011273,\n \"\
acc_norm\": 0.4483870967741935,\n \"acc_norm_stderr\": 0.02829205683011273\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969565,\n \"\
acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969565\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03825460278380026,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03825460278380026\n },\n\
\ \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4797979797979798,\n\
\ \"acc_stderr\": 0.03559443565563919,\n \"acc_norm\": 0.4797979797979798,\n\
\ \"acc_norm_stderr\": 0.03559443565563919\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.5025906735751295,\n \"acc_stderr\": 0.03608390745384487,\n\
\ \"acc_norm\": 0.5025906735751295,\n \"acc_norm_stderr\": 0.03608390745384487\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3641025641025641,\n \"acc_stderr\": 0.02439667298509478,\n \
\ \"acc_norm\": 0.3641025641025641,\n \"acc_norm_stderr\": 0.02439667298509478\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.026335739404055803,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.026335739404055803\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.036030385453603854,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.036030385453603854\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5376146788990825,\n \"acc_stderr\": 0.02137657527439758,\n \"\
acc_norm\": 0.5376146788990825,\n \"acc_norm_stderr\": 0.02137657527439758\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2824074074074074,\n \"acc_stderr\": 0.030701372111510923,\n \"\
acc_norm\": 0.2824074074074074,\n \"acc_norm_stderr\": 0.030701372111510923\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.43137254901960786,\n \"acc_stderr\": 0.03476099060501637,\n \"\
acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.03476099060501637\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.379746835443038,\n \"acc_stderr\": 0.031591887529658504,\n \
\ \"acc_norm\": 0.379746835443038,\n \"acc_norm_stderr\": 0.031591887529658504\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.47085201793721976,\n\
\ \"acc_stderr\": 0.03350073248773404,\n \"acc_norm\": 0.47085201793721976,\n\
\ \"acc_norm_stderr\": 0.03350073248773404\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4198473282442748,\n \"acc_stderr\": 0.04328577215262972,\n\
\ \"acc_norm\": 0.4198473282442748,\n \"acc_norm_stderr\": 0.04328577215262972\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5537190082644629,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\"\
: 0.5537190082644629,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.047500773411999854,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.047500773411999854\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.36809815950920244,\n \"acc_stderr\": 0.03789213935838396,\n\
\ \"acc_norm\": 0.36809815950920244,\n \"acc_norm_stderr\": 0.03789213935838396\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.044642857142857116,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.044642857142857116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5825242718446602,\n \"acc_stderr\": 0.04882840548212238,\n\
\ \"acc_norm\": 0.5825242718446602,\n \"acc_norm_stderr\": 0.04882840548212238\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6367521367521367,\n\
\ \"acc_stderr\": 0.03150712523091264,\n \"acc_norm\": 0.6367521367521367,\n\
\ \"acc_norm_stderr\": 0.03150712523091264\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5312899106002554,\n\
\ \"acc_stderr\": 0.01784491809046855,\n \"acc_norm\": 0.5312899106002554,\n\
\ \"acc_norm_stderr\": 0.01784491809046855\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.37283236994219654,\n \"acc_stderr\": 0.02603389061357628,\n\
\ \"acc_norm\": 0.37283236994219654,\n \"acc_norm_stderr\": 0.02603389061357628\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574877,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574877\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.42483660130718953,\n \"acc_stderr\": 0.028304576673141114,\n\
\ \"acc_norm\": 0.42483660130718953,\n \"acc_norm_stderr\": 0.028304576673141114\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4212218649517685,\n\
\ \"acc_stderr\": 0.02804339985821063,\n \"acc_norm\": 0.4212218649517685,\n\
\ \"acc_norm_stderr\": 0.02804339985821063\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.027648477877413324,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.027648477877413324\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3120567375886525,\n \"acc_stderr\": 0.027640120545169927,\n \
\ \"acc_norm\": 0.3120567375886525,\n \"acc_norm_stderr\": 0.027640120545169927\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3005215123859192,\n\
\ \"acc_stderr\": 0.011709918883039114,\n \"acc_norm\": 0.3005215123859192,\n\
\ \"acc_norm_stderr\": 0.011709918883039114\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2610294117647059,\n \"acc_stderr\": 0.02667925227010312,\n\
\ \"acc_norm\": 0.2610294117647059,\n \"acc_norm_stderr\": 0.02667925227010312\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3758169934640523,\n \"acc_stderr\": 0.01959402113657745,\n \
\ \"acc_norm\": 0.3758169934640523,\n \"acc_norm_stderr\": 0.01959402113657745\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4636363636363636,\n\
\ \"acc_stderr\": 0.047764491623961985,\n \"acc_norm\": 0.4636363636363636,\n\
\ \"acc_norm_stderr\": 0.047764491623961985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4775510204081633,\n \"acc_stderr\": 0.03197694118713672,\n\
\ \"acc_norm\": 0.4775510204081633,\n \"acc_norm_stderr\": 0.03197694118713672\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4079601990049751,\n\
\ \"acc_stderr\": 0.034751163651940926,\n \"acc_norm\": 0.4079601990049751,\n\
\ \"acc_norm_stderr\": 0.034751163651940926\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5380116959064327,\n \"acc_stderr\": 0.03823727092882307,\n\
\ \"acc_norm\": 0.5380116959064327,\n \"acc_norm_stderr\": 0.03823727092882307\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2215422276621787,\n\
\ \"mc1_stderr\": 0.014537867601301137,\n \"mc2\": 0.3491235555428734,\n\
\ \"mc2_stderr\": 0.01355886433946854\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6503551696921863,\n \"acc_stderr\": 0.0134020736808505\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10841546626231995,\n \
\ \"acc_stderr\": 0.0085638525066275\n }\n}\n```"
repo_url: https://huggingface.co/uukuguy/Mistral-7B-OpenOrca-lora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|arc:challenge|25_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|arc:challenge|25_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|gsm8k|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|gsm8k|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hellaswag|10_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hellaswag|10_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T13-29-36.462126.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T13-29-55.724034.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T13-29-55.724034.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- '**/details_harness|winogrande|5_2024-04-08T13-29-36.462126.parquet'
- split: 2024_04_08T13_29_55.724034
path:
- '**/details_harness|winogrande|5_2024-04-08T13-29-55.724034.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-08T13-29-55.724034.parquet'
- config_name: results
data_files:
- split: 2024_04_08T13_29_36.462126
path:
- results_2024-04-08T13-29-36.462126.parquet
- split: 2024_04_08T13_29_55.724034
path:
- results_2024-04-08T13-29-55.724034.parquet
- split: latest
path:
- results_2024-04-08T13-29-55.724034.parquet
---
# Dataset Card for Evaluation run of gemmathon/gemma-pro-3.1b-ko-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gemmathon/gemma-pro-3.1b-ko-v0.1](https://huggingface.co/gemmathon/gemma-pro-3.1b-ko-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
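As a quick illustration of how the "latest" split relates to the timestamped splits, here is a minimal sketch using the two run timestamps that appear in this card's configurations. Since the split names are zero-padded timestamps, they sort lexicographically in chronological order, so the most recent run is simply the maximum split name:

```python
# The two timestamped splits present in this card's configurations.
splits = ["2024_04_08T13_29_36.462126", "2024_04_08T13_29_55.724034"]

# Zero-padded timestamps sort lexicographically in chronological order,
# so the most recent run is simply the maximum split name.
latest_run = max(splits)
print(latest_run)  # 2024_04_08T13_29_55.724034
```

The "latest" split in each configuration above points to the parquet file of exactly this most recent run.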
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gemmathon__gemma-pro-3.1b-ko-v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-08T13:29:55.724034](https://huggingface.co/datasets/open-llm-leaderboard/details_gemmathon__gemma-pro-3.1b-ko-v0.1/blob/main/results_2024-04-08T13-29-55.724034.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.39457199436905527,
"acc_stderr": 0.034199970929033895,
"acc_norm": 0.3985370625965584,
"acc_norm_stderr": 0.03498053366386257,
"mc1": 0.2215422276621787,
"mc1_stderr": 0.014537867601301137,
"mc2": 0.3491235555428734,
"mc2_stderr": 0.01355886433946854
},
"harness|arc:challenge|25": {
"acc": 0.4496587030716723,
"acc_stderr": 0.014537144444284732,
"acc_norm": 0.4709897610921502,
"acc_norm_stderr": 0.014586776355294321
},
"harness|hellaswag|10": {
"acc": 0.5220075682135032,
"acc_stderr": 0.004984945635998312,
"acc_norm": 0.7042421828321052,
"acc_norm_stderr": 0.00455449940929072
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3618421052631579,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.3618421052631579,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44528301886792454,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.44528301886792454,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.04101405519842425,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.04101405519842425
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.37572254335260113,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.37572254335260113,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149351,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149351
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.031778212502369216,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.031778212502369216
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4068965517241379,
"acc_stderr": 0.04093793981266237,
"acc_norm": 0.4068965517241379,
"acc_norm_stderr": 0.04093793981266237
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02306818884826112,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02306818884826112
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488774,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488774
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4483870967741935,
"acc_stderr": 0.02829205683011273,
"acc_norm": 0.4483870967741935,
"acc_norm_stderr": 0.02829205683011273
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969565,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969565
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4,
"acc_stderr": 0.03825460278380026,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03825460278380026
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4797979797979798,
"acc_stderr": 0.03559443565563919,
"acc_norm": 0.4797979797979798,
"acc_norm_stderr": 0.03559443565563919
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5025906735751295,
"acc_stderr": 0.03608390745384487,
"acc_norm": 0.5025906735751295,
"acc_norm_stderr": 0.03608390745384487
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3641025641025641,
"acc_stderr": 0.02439667298509478,
"acc_norm": 0.3641025641025641,
"acc_norm_stderr": 0.02439667298509478
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.026335739404055803,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.026335739404055803
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.036030385453603854,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.036030385453603854
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5376146788990825,
"acc_stderr": 0.02137657527439758,
"acc_norm": 0.5376146788990825,
"acc_norm_stderr": 0.02137657527439758
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2824074074074074,
"acc_stderr": 0.030701372111510923,
"acc_norm": 0.2824074074074074,
"acc_norm_stderr": 0.030701372111510923
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.03476099060501637,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.03476099060501637
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.379746835443038,
"acc_stderr": 0.031591887529658504,
"acc_norm": 0.379746835443038,
"acc_norm_stderr": 0.031591887529658504
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.47085201793721976,
"acc_stderr": 0.03350073248773404,
"acc_norm": 0.47085201793721976,
"acc_norm_stderr": 0.03350073248773404
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4198473282442748,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.4198473282442748,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5537190082644629,
"acc_stderr": 0.0453793517794788,
"acc_norm": 0.5537190082644629,
"acc_norm_stderr": 0.0453793517794788
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.047500773411999854,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.047500773411999854
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.36809815950920244,
"acc_stderr": 0.03789213935838396,
"acc_norm": 0.36809815950920244,
"acc_norm_stderr": 0.03789213935838396
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.044642857142857116,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.044642857142857116
},
"harness|hendrycksTest-management|5": {
"acc": 0.5825242718446602,
"acc_stderr": 0.04882840548212238,
"acc_norm": 0.5825242718446602,
"acc_norm_stderr": 0.04882840548212238
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6367521367521367,
"acc_stderr": 0.03150712523091264,
"acc_norm": 0.6367521367521367,
"acc_norm_stderr": 0.03150712523091264
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5312899106002554,
"acc_stderr": 0.01784491809046855,
"acc_norm": 0.5312899106002554,
"acc_norm_stderr": 0.01784491809046855
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.37283236994219654,
"acc_stderr": 0.02603389061357628,
"acc_norm": 0.37283236994219654,
"acc_norm_stderr": 0.02603389061357628
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574877,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574877
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.42483660130718953,
"acc_stderr": 0.028304576673141114,
"acc_norm": 0.42483660130718953,
"acc_norm_stderr": 0.028304576673141114
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4212218649517685,
"acc_stderr": 0.02804339985821063,
"acc_norm": 0.4212218649517685,
"acc_norm_stderr": 0.02804339985821063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.027648477877413324,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.027648477877413324
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3120567375886525,
"acc_stderr": 0.027640120545169927,
"acc_norm": 0.3120567375886525,
"acc_norm_stderr": 0.027640120545169927
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3005215123859192,
"acc_stderr": 0.011709918883039114,
"acc_norm": 0.3005215123859192,
"acc_norm_stderr": 0.011709918883039114
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2610294117647059,
"acc_stderr": 0.02667925227010312,
"acc_norm": 0.2610294117647059,
"acc_norm_stderr": 0.02667925227010312
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3758169934640523,
"acc_stderr": 0.01959402113657745,
"acc_norm": 0.3758169934640523,
"acc_norm_stderr": 0.01959402113657745
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4636363636363636,
"acc_stderr": 0.047764491623961985,
"acc_norm": 0.4636363636363636,
"acc_norm_stderr": 0.047764491623961985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4775510204081633,
"acc_stderr": 0.03197694118713672,
"acc_norm": 0.4775510204081633,
"acc_norm_stderr": 0.03197694118713672
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4079601990049751,
"acc_stderr": 0.034751163651940926,
"acc_norm": 0.4079601990049751,
"acc_norm_stderr": 0.034751163651940926
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5380116959064327,
"acc_stderr": 0.03823727092882307,
"acc_norm": 0.5380116959064327,
"acc_norm_stderr": 0.03823727092882307
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2215422276621787,
"mc1_stderr": 0.014537867601301137,
"mc2": 0.3491235555428734,
"mc2_stderr": 0.01355886433946854
},
"harness|winogrande|5": {
"acc": 0.6503551696921863,
"acc_stderr": 0.0134020736808505
},
"harness|gsm8k|5": {
"acc": 0.10841546626231995,
"acc_stderr": 0.0085638525066275
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Dahoas/cot_gsm8k | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 7710945
num_examples: 7217
- name: val
num_bytes: 267770
num_examples: 256
- name: test
num_bytes: 1436697
num_examples: 1319
download_size: 5472201
dataset_size: 9415412
---
# Dataset Card for "cot_gsm8k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alwanrahmana/labeled-20-data-conll | ---
license: unknown
---
|
ed001/ds-coder-instruct-v1 | ---
task_categories:
- text-generation
- conversational
- text2text-generation
language:
- en
tags:
- code
- machine learning
- deep learning
- data science
pretty_name: Data Science Coder
size_categories:
- 10K<n<100K
configs:
- config_name: default
data_files:
- split: train
path: ds_coder.jsonl
license: cc-by-nc-sa-4.0
---
# Dataset Card for DS Coder Instruct Dataset
<!-- Provide a quick summary of the dataset. -->
DS Coder is a dataset for instruction fine-tuning of language models. It is a specialized dataset focusing only on
data science (e.g. plotting, data wrangling, machine learning models, deep learning, and numerical computations). The dataset contains code examples in both R and Python.
The goal of this dataset is to enable the creation of small-scale, specialized language model assistants for data science projects.
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
The DS Coder instruct dataset contains *(input, instruction, output)* triplets. The instruction provides a task in the data science domain, and the output contains the code to solve that task.
Where available, it also contains a *text* field holding an Alpaca-style prompt. Metadata, such as the programming language *(lang)* and topics *(topics)*, is provided.
*topics* lists the concepts used in the code (e.g. ML, neural networks, plotting, etc.), determined from the kinds of libraries the code uses. This field can be used
to obtain a subset of the data for specific tasks, such as data visualization.
Additionally, the original data source is provided under the *dataset* field.
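For example, the *topics* field can be used to pull out a task-specific subset. The sketch below operates on plain dicts with illustrative field values (not actual rows from the dataset); with the `datasets` library, the same predicate can be passed to `Dataset.filter`.

```python
# Illustrative records; real rows carry the fields described above.
rows = [
    {"instruction": "Plot a histogram of ages", "lang": "python", "topics": ["plotting"]},
    {"instruction": "Train a random forest",    "lang": "python", "topics": ["ML"]},
    {"instruction": "Reshape a data frame",     "lang": "r",      "topics": ["data wrangling"]},
]

def has_topic(row, topic):
    # A row may list several topics; keep it if the requested one appears.
    return topic in row["topics"]

plotting_rows = [r for r in rows if has_topic(r, "plotting")]
print(len(plotting_rows))  # 1
```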
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
DS Coder is filtered and preprocessed from a collection of publicly available datasets on HuggingFace. All sources are listed below with their corresponding links.
- **nickrosh/Evol-Instruct-Code-80k-v1:** https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1
- **TokenBender/code_instructions_122k_alpaca_style:** https://huggingface.co/datasets/TokenBender/code_instructions_122k_alpaca_style
- **theblackcat102/evol-codealpaca-v1:** https://huggingface.co/datasets/theblackcat102/evol-codealpaca-v1
- **ise-uiuc/Magicoder-OSS-Instruct-75K:** https://huggingface.co/datasets/ise-uiuc/Magicoder-OSS-Instruct-75K
Please make sure to cite the above-mentioned sources when using this dataset. You should visit their pages and look for specific usage instructions, if any.
## Dataset Creation
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
DS Coder was created by filtering and processing existing public datasets of *(instruction, code)* pairs. Source data was filtered to keep only code related to data science
applications. The filtering was done using regex to gather code that uses popular data science libraries (e.g. Matplotlib, Sklearn, PyTorch, etc.) in Python and R.
The data was then further processed to filter out samples with very long or very short code. Code outputs with many comments and little code were filtered out.
Additionally, samples with very long or very short instructions were also removed.
After filtering, exact deduplication based on output code and input instruction was performed. After this process, roughly *16K* samples remain.
A more detailed description of the dataset processing is provided below.
### Filtering
The first step of the filtering process is to gather all samples from the source datasets that contain code related to a data science application. To do so, regex filtering was
applied to the *code* and *instruction* fields to select such samples. Regex filters mainly look for imports and usage of popular data science libraries, such as Pandas or PyTorch.
Data science code in both Python and R is gathered.
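A minimal sketch of this kind of regex selection is shown below. The library list and patterns are illustrative, not the exact ones used in the processing script.

```python
import re

# Illustrative set of popular Python data science libraries.
DS_LIBS = r"(pandas|numpy|sklearn|matplotlib|torch|tensorflow|seaborn)"

# Match "import <lib>" or "from <lib> import ..." at the start of a line.
PY_IMPORT = re.compile(rf"^\s*(import|from)\s+{DS_LIBS}\b", re.MULTILINE)

def is_data_science_code(code: str) -> bool:
    return bool(PY_IMPORT.search(code))

print(is_data_science_code("import pandas as pd\ndf = pd.DataFrame()"))  # True
print(is_data_science_code("print('hello')"))                            # False
```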
After gathering relevant code samples, further filtering based on line length, instruction length, alphanumeric ratio, and comment-to-code ratio is performed.
Code filtering is similar to [BigCode](https://github.com/bigcode-project/bigcode-dataset); the code filtering parameters shown below are derived from there.
This stage ensures that short, very long, and uninformative samples are removed. The filtering script can be found in the repo
[Ea0011/wrangler](https://github.com/Ea0011/wrangler). You may use it to process additional datasets or tweak the parameters.
Parameters for filtering are listed below:
- **line_max**: Maximum line length allowed is 1000 characters.
- **line_mean**: Maximum mean line length allowed is 100 characters.
- **alpha_frac**: Minimum fraction of alphanumeric characters allowed is 25%.
- **min_inst_size**: Minimum instruction size in words is 5 words.
- **max_inst_size**: Maximum instruction size in words is 1000 words.
- **max_threshold_comments**: Maximum threshold for comment to code ratio is 80%.
- **min_threshold_comments**: Minimum threshold for comment to code ratio is 1%.
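The thresholds above can be applied roughly as follows. This is a hypothetical re-implementation for illustration; the exact logic lives in the Ea0011/wrangler repository.

```python
def passes_filters(code: str, instruction: str,
                   line_max=1000, line_mean=100, alpha_frac=0.25,
                   min_inst_size=5, max_inst_size=1000,
                   min_comments=0.01, max_comments=0.80) -> bool:
    """Illustrative sample-level filter using the thresholds listed above."""
    lines = code.splitlines() or [""]
    # Line-length checks.
    if max(len(line) for line in lines) > line_max:
        return False
    if sum(len(line) for line in lines) / len(lines) > line_mean:
        return False
    # Alphanumeric fraction of the code.
    if code and sum(ch.isalnum() for ch in code) / len(code) < alpha_frac:
        return False
    # Instruction size in words.
    n_words = len(instruction.split())
    if not (min_inst_size <= n_words <= max_inst_size):
        return False
    # Comment-to-code ratio (naively counting '#' / '//' comment lines).
    comment_ratio = sum(l.lstrip().startswith(("#", "//")) for l in lines) / len(lines)
    return min_comments <= comment_ratio <= max_comments

good = "# load data\nimport pandas as pd\ndf = pd.read_csv('x.csv')\n"
print(passes_filters(good, "Load a CSV file into a pandas DataFrame"))  # True
print(passes_filters(good, "Plot data"))  # False: instruction too short
```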
## Data Analysis
This section provides some analysis of the dataset. Code lengths, language distribution as well as distribution of data science tasks are shown. Topic distribution shows
distribution of concepts used in the code. Some domains, such as plotting are underrepresanted compared to others. You may use the topics column to select samples for specific tasks.
<img src="lang_dist.png" width="60%"/>
<img src="ds_dist.png" width="60%" />
<img src="inst_len_total.png" width="60%"/>
<img src="topics.png" width="60%" />
As there are data points from several sources, it is also worth showing distributions across samples from the different datasets. As can be seen, some sources
contain short and concise samples while others contain verbose ones. Use this information to choose a specific data source if needed.
<img src="code_len.png" width="60%"/>
<img src="inst_len.png" width="60%" />
## Dataset Card Contact
For any suggestions and concerns please reach out to me: [Ea0011](https://github.com/Ea0011/) |
yangwang825/reuters-21578 | ---
task_categories:
- text-classification
language:
- en
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': acq
'1': crude
'2': earn
'3': grain
'4': interest
'5': money-fx
'6': ship
'7': trade
---
`yangwang825/reuters-21578` is an 8-class subset of the Reuters 21578 news dataset.
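For convenience, the integer labels decode to topic names in the order declared in the `class_label` mapping above (with the `datasets` library, the equivalent mapping is available via `dataset.features["label"].int2str`):

```python
# Integer label ids follow the class_label order declared in dataset_info.
LABEL_NAMES = ["acq", "crude", "earn", "grain",
               "interest", "money-fx", "ship", "trade"]

def id2label(label_id: int) -> str:
    return LABEL_NAMES[label_id]

def label2id(name: str) -> int:
    return LABEL_NAMES.index(name)

print(id2label(5))       # money-fx
print(label2id("earn"))  # 2
```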
|
erdometo/tquad2 | ---
license: other
license_name: tquad2
license_link: https://huggingface.co/datasets/husnu/tquad2
---
|
suriyagunasekar/stackoverflow-python-with-meta-data | ---
dataset_info:
features:
- name: content
dtype: string
- name: title
dtype: string
- name: question
dtype: string
- name: answers
sequence: string
- name: answers_scores
sequence: int32
- name: non_answers
sequence: string
- name: non_answers_scores
sequence: int32
- name: tags
sequence: string
- name: name
dtype: string
splits:
- name: train
num_bytes: 9114535831
num_examples: 1745972
download_size: 4753108665
dataset_size: 9114535831
---
# Dataset Card for "stackoverflow-python-with-meta-data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
elzooz/Amod_mental_health_counseling_conversations | ---
license: openrail
---
|
Indic-Benchmark/bengali-arc-c-2.5k | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
struct:
- name: choices
list:
- name: label
dtype: string
- name: text
dtype: string
- name: stem
dtype: string
- name: answerKey
dtype: string
splits:
- name: train
num_bytes: 1822128
num_examples: 2572
download_size: 685372
dataset_size: 1822128
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ej2/sam-50-336 | ---
license: mit
---
|
datahrvoje/twitter_dataset_1713179739 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 26440
num_examples: 63
download_size: 13243
dataset_size: 26440
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ibranze/araproje_hellaswag_en_conf_llama_nearestscore_true | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 149738.0
num_examples: 250
download_size: 81112
dataset_size: 149738.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_en_conf_llama_nearestscore_true"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
v-xchen-v/agieval_eng_cloze | ---
license: mit
---
|
CyberHarem/gray_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of gray/グレイ/格蕾 (Fate/Grand Order)
This is the dataset of gray/グレイ/格蕾 (Fate/Grand Order), containing 465 images and their tags.
The core tags of this character are `grey_hair, green_eyes, short_hair, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 465 | 602.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gray_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 465 | 531.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gray_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1132 | 1.00 GiB | [Download](https://huggingface.co/datasets/CyberHarem/gray_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/gray_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | fur_trim, hood_up, 2girls, cloak, long_hair, long_sleeves, blonde_hair, blush, solo_focus, open_mouth, capelet, sweatdrop |
| 1 | 22 |  |  |  |  |  | 1girl, black_thighhighs, cloak, fur_trim, pleated_skirt, solo, cape, hood_up, plaid_skirt, long_sleeves, black_gloves, grey_skirt, zettai_ryouiki, looking_at_viewer, miniskirt, black_footwear, simple_background, white_background, boots, standing, full_body, closed_mouth, jacket, holding |
| 2 | 14 |  |  |  |  |  | 1girl, fur_trim, hood_up, looking_at_viewer, solo, hooded_cloak, long_sleeves, black_dress, white_ribbon, closed_mouth, blush, capelet, holding, smile, white_background |
| 3 | 5 |  |  |  |  |  | 1girl, fur_trim, hood_up, hooded_cloak, looking_at_viewer, upper_body, cape, closed_mouth, simple_background, solo, white_background |
| 4 | 6 |  |  |  |  |  | 1girl, cloak, fur_trim, hood_up, parted_lips, portrait, simple_background, solo, white_background, looking_at_viewer |
| 5 | 5 |  |  |  |  |  | 1girl, black_gloves, hood_up, hooded_cloak, solo, upper_body, birdcage, cape, fur-trimmed_cloak, looking_at_viewer, parted_lips, holding_lantern, long_sleeves, black_cloak, white_hair |
| 6 | 7 |  |  |  |  |  | 1girl, bare_shoulders, collarbone, looking_at_viewer, ahoge, braid, hair_ribbon, short_sleeves, solo, alternate_costume, blush, earrings, black_ribbon, blue_dress, blue_eyes, hair_bun, simple_background, upper_body, off-shoulder_dress, parted_lips, sidelocks, white_belt |
| 7 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, outdoors, solo, medium_breasts, ocean, ahoge, beach, blue_sky, collarbone, day, navel, cloud, hood, long_sleeves, open_jacket, open_mouth, sidelocks, white_bikini, black_bikini, braid, cleavage, closed_mouth, small_breasts, smile, stomach |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | fur_trim | hood_up | 2girls | cloak | long_hair | long_sleeves | blonde_hair | blush | solo_focus | open_mouth | capelet | sweatdrop | 1girl | black_thighhighs | pleated_skirt | solo | cape | plaid_skirt | black_gloves | grey_skirt | zettai_ryouiki | looking_at_viewer | miniskirt | black_footwear | simple_background | white_background | boots | standing | full_body | closed_mouth | jacket | holding | hooded_cloak | black_dress | white_ribbon | smile | upper_body | parted_lips | portrait | birdcage | fur-trimmed_cloak | holding_lantern | black_cloak | white_hair | bare_shoulders | collarbone | ahoge | braid | hair_ribbon | short_sleeves | alternate_costume | earrings | black_ribbon | blue_dress | blue_eyes | hair_bun | off-shoulder_dress | sidelocks | white_belt | outdoors | medium_breasts | ocean | beach | blue_sky | day | navel | cloud | hood | open_jacket | white_bikini | black_bikini | cleavage | small_breasts | stomach |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------|:----------|:---------|:--------|:------------|:---------------|:--------------|:--------|:-------------|:-------------|:----------|:------------|:--------|:-------------------|:----------------|:-------|:-------|:--------------|:---------------|:-------------|:-----------------|:--------------------|:------------|:-----------------|:--------------------|:-------------------|:--------|:-----------|:------------|:---------------|:---------|:----------|:---------------|:--------------|:---------------|:--------|:-------------|:--------------|:-----------|:-----------|:--------------------|:------------------|:--------------|:-------------|:-----------------|:-------------|:--------|:--------|:--------------|:----------------|:--------------------|:-----------|:---------------|:-------------|:------------|:-----------|:---------------------|:------------|:-------------|:-----------|:-----------------|:--------|:--------|:-----------|:------|:--------|:--------|:-------|:--------------|:---------------|:---------------|:-----------|:----------------|:----------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 22 |  |  |  |  |  | X | X | | X | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | X | | | | X | | X | | | X | | X | | | X | | | | | | X | | | | X | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | | | | | | | | | | X | | | X | X | | | | | X | | | X | X | | | | X | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | | X | | | | | | | | | X | | | X | | | | | | X | | | X | X | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | | X | | | | X | | | | | | | X | | | X | X | | X | | | X | | | | | | | | | | | X | | | | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | | | | | | | | X | | | | | X | | | X | | | | | | X | | | X | | | | | | | | | | | | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | | | | | | X | | X | | X | | | X | | | X | | | | | | X | | | | | | | | X | | | | | | X | | | | | | | | | | X | X | X | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
roleplay4fun/aesir-v1.1 | ---
dataset_info:
features:
- name: bot_name
dtype: string
- name: user_name
dtype: string
- name: persona
dtype: string
- name: multi_personas
sequence: 'null'
- name: demos
dtype: string
- name: scenario
dtype: string
- name: first_message
dtype: string
- name: tags
sequence: string
- name: source
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 10157003.22
num_examples: 980
download_size: 5921224
dataset_size: 10157003.22
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
msaad02/preformat-ss-cleaned-brockport-qa | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1633650
num_examples: 7098
download_size: 0
dataset_size: 1633650
---
# Dataset Card for "preformat-ss-cleaned-brockport-qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_jondurbin__airoboros-c34b-2.2.1 | ---
pretty_name: Evaluation run of jondurbin/airoboros-c34b-2.2.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-c34b-2.2.1](https://huggingface.co/jondurbin/airoboros-c34b-2.2.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-c34b-2.2.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-27T01:27:27.321829](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-c34b-2.2.1/blob/main/results_2023-10-27T01-27-27.321829.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.020343959731543623,\n\
\ \"em_stderr\": 0.0014457533435412132,\n \"f1\": 0.08045826342281848,\n\
\ \"f1_stderr\": 0.0019040548084027437,\n \"acc\": 0.4627435340326824,\n\
\ \"acc_stderr\": 0.011782817513813687\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.020343959731543623,\n \"em_stderr\": 0.0014457533435412132,\n\
\ \"f1\": 0.08045826342281848,\n \"f1_stderr\": 0.0019040548084027437\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2001516300227445,\n \
\ \"acc_stderr\": 0.011021119022510194\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7253354380426204,\n \"acc_stderr\": 0.01254451600511718\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-c34b-2.2.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|arc:challenge|25_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_27T01_27_27.321829
path:
- '**/details_harness|drop|3_2023-10-27T01-27-27.321829.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-27T01-27-27.321829.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_27T01_27_27.321829
path:
- '**/details_harness|gsm8k|5_2023-10-27T01-27-27.321829.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-27T01-27-27.321829.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hellaswag|10_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T18-58-09.868261.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T18-58-09.868261.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T18-58-09.868261.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_27T01_27_27.321829
path:
- '**/details_harness|winogrande|5_2023-10-27T01-27-27.321829.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-27T01-27-27.321829.parquet'
- config_name: results
data_files:
- split: 2023_10_01T18_58_09.868261
path:
- results_2023-10-01T18-58-09.868261.parquet
- split: 2023_10_27T01_27_27.321829
path:
- results_2023-10-27T01-27-27.321829.parquet
- split: latest
path:
- results_2023-10-27T01-27-27.321829.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-c34b-2.2.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-c34b-2.2.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-c34b-2.2.1](https://huggingface.co/jondurbin/airoboros-c34b-2.2.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-c34b-2.2.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-27T01:27:27.321829](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-c34b-2.2.1/blob/main/results_2023-10-27T01-27-27.321829.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.020343959731543623,
"em_stderr": 0.0014457533435412132,
"f1": 0.08045826342281848,
"f1_stderr": 0.0019040548084027437,
"acc": 0.4627435340326824,
"acc_stderr": 0.011782817513813687
},
"harness|drop|3": {
"em": 0.020343959731543623,
"em_stderr": 0.0014457533435412132,
"f1": 0.08045826342281848,
"f1_stderr": 0.0019040548084027437
},
"harness|gsm8k|5": {
"acc": 0.2001516300227445,
"acc_stderr": 0.011021119022510194
},
"harness|winogrande|5": {
"acc": 0.7253354380426204,
"acc_stderr": 0.01254451600511718
}
}
```
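The aggregated results above are plain nested dictionaries once the JSON is loaded, so individual metrics can be pulled out directly. A minimal sketch (the dictionary below is a subset of the results shown above, keeping only one metric per task):

```python
# Subset of the aggregated results shown above, keyed by "harness|task|num_fewshot".
results = {
    "harness|drop|3": {"em": 0.020343959731543623, "f1": 0.08045826342281848},
    "harness|gsm8k|5": {"acc": 0.2001516300227445},
    "harness|winogrande|5": {"acc": 0.7253354380426204},
}

# Collect every accuracy-style metric, keyed by task name.
accuracies = {task: m["acc"] for task, m in results.items() if "acc" in m}

# The task with the highest accuracy in this run.
best_task = max(accuracies, key=accuracies.get)
print(best_task)  # harness|winogrande|5
```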
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-international_law-neg-prepend-fix | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 7660
num_examples: 5
- name: test
num_bytes: 467360
num_examples: 121
download_size: 15537
dataset_size: 475020
---
# Dataset Card for "mmlu-international_law-neg-prepend-fix"
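The `answer` feature is stored as an integer class label (`0`–`3`, per the metadata above). A minimal sketch for mapping it back to the letter choices, in plain Python with no `datasets` dependency; the `decode_answer` helper is a hypothetical illustration, not part of the dataset:

```python
# Class-label names taken from the card metadata above: 0 -> A, 1 -> B, 2 -> C, 3 -> D.
LABEL_NAMES = ["A", "B", "C", "D"]

def decode_answer(label: int) -> str:
    """Map an integer 'answer' label to its letter choice (hypothetical helper)."""
    return LABEL_NAMES[label]

print(decode_answer(2))  # -> C
```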
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HuggingFaceM4/mmbench_support_query_sets | |
a-asad/sharifQuAD | ---
license: apache-2.0
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 1018017
num_examples: 349
download_size: 97151
dataset_size: 1018017
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/hmg21_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hmg21/HMG21/HK21 (Girls' Frontline)
This is the dataset of hmg21/HMG21/HK21 (Girls' Frontline), containing 37 images and their tags.
The core tags of this character are `long_hair, hair_over_one_eye, twintails, bangs, glasses, round_eyewear, purple_eyes, breasts, grey_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 37 | 39.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hmg21_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 37       | 26.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hmg21_girlsfrontline/resolve/main/dataset-800.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.              |
| stage3-p480-800 | 80 | 50.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hmg21_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200             | 37       | 36.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hmg21_girlsfrontline/resolve/main/dataset-1200.zip)              | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.             |
| stage3-p480-1200 | 80 | 66.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hmg21_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hmg21_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, solo, red_socks, boots, looking_at_viewer, simple_background, long_sleeves, belt, closed_mouth, cross, jacket, knee_pads, white_background, full_body, blush, gun, scarf, black_gloves, sitting, buckle, open_coat, sleeves_past_wrists, fur_trim, hair_ornament, holding, leotard, necklace, red_footwear, single_kneehigh, striped |
| 1 | 6 |  |  |  |  |  | 1girl, simple_background, solo, upper_body, bare_shoulders, closed_mouth, cross, looking_at_viewer, white_background, blonde_hair, blush, ponytail, sleeveless |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | red_socks | boots | looking_at_viewer | simple_background | long_sleeves | belt | closed_mouth | cross | jacket | knee_pads | white_background | full_body | blush | gun | scarf | black_gloves | sitting | buckle | open_coat | sleeves_past_wrists | fur_trim | hair_ornament | holding | leotard | necklace | red_footwear | single_kneehigh | striped | upper_body | bare_shoulders | blonde_hair | ponytail | sleeveless |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:------------|:--------|:--------------------|:--------------------|:---------------|:-------|:---------------|:--------|:---------|:------------|:-------------------|:------------|:--------|:------|:--------|:---------------|:----------|:---------|:------------|:----------------------|:-----------|:----------------|:----------|:----------|:-----------|:---------------|:------------------|:----------|:-------------|:-----------------|:--------------|:-----------|:-------------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | | | X | X | | | X | X | | | X | | X | | | | | | | | | | | | | | | | X | X | X | X | X |
|
nitinbhayana/dabur_361 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 140926
num_examples: 1037
- name: test
num_bytes: 63802
num_examples: 462
download_size: 98490
dataset_size: 204728
---
# Dataset Card for "dabur_361"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/87014f38 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1339
dataset_size: 182
---
# Dataset Card for "87014f38"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_242 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 971718540.0
num_examples: 189345
download_size: 995248480
dataset_size: 971718540.0
---
# Dataset Card for "chunk_242"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zzsi/flying_mnist | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
dataset_info:
features:
- name: video
sequence:
sequence:
sequence:
sequence: float32
splits:
- name: train
num_bytes: 4993216000
num_examples: 1000
- name: val
num_bytes: 499321600
num_examples: 100
download_size: 154449837
dataset_size: 5492537600
---
|
WeekendAI/covidQnA | ---
license: openrail
---
|
Kotiralla/mj8 | ---
license: lppl-1.3c
---
|
galman33/gal_yair_8300_1664x832_fixed | ---
dataset_info:
features:
- name: lat
dtype: float64
- name: lon
dtype: float64
- name: country_code
dtype:
class_label:
names:
'0': ad
'1': ae
'2': al
'3': aq
'4': ar
'5': au
'6': bd
'7': be
'8': bg
'9': bm
'10': bo
'11': br
'12': bt
'13': bw
'14': ca
'15': ch
'16': cl
'17': co
'18': cz
'19': de
'20': dk
'21': ec
'22': ee
'23': es
'24': fi
'25': fr
'26': gb
'27': gh
'28': gl
'29': gr
'30': gt
'31': hk
'32': hr
'33': hu
'34': id
'35': ie
'36': il
'37': is
'38': it
'39': ix
'40': jp
'41': kg
'42': kh
'43': kr
'44': la
'45': lk
'46': ls
'47': lt
'48': lu
'49': lv
'50': me
'51': mg
'52': mk
'53': mn
'54': mo
'55': mt
'56': mx
'57': my
'58': nl
'59': 'no'
'60': nz
'61': pe
'62': ph
'63': pl
'64': pt
'65': ro
'66': rs
'67': ru
'68': se
'69': sg
'70': si
'71': sk
'72': sn
'73': sz
'74': th
'75': tn
'76': tr
'77': tw
'78': ua
'79': ug
'80': us
'81': uy
'82': za
- name: image
dtype: image
splits:
- name: train
num_bytes: 1412450526.5
num_examples: 8300
download_size: 1411162723
dataset_size: 1412450526.5
---
# Dataset Card for "gal_yair_8300_1664x832_fixed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
burtenshaw/DIBT_prompts_ranked_synthetic_gpt35 | ---
dataset_info:
features:
- name: input
dtype: string
- name: quality
list:
- name: status
dtype: string
- name: user_id
dtype: string
- name: value
dtype: string
- name: metadata
dtype: string
- name: avg_rating
dtype: float64
- name: num_responses
dtype: int64
- name: agreement_ratio
dtype: float64
- name: raw_responses
sequence: int64
- name: kind
dtype: string
- name: cluster_description
dtype: string
- name: topic
dtype: string
- name: generation_model
sequence: string
- name: generation_prompt
list:
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_generation_responses
sequence: string
- name: rating
sequence: float64
- name: rationale
sequence: string
- name: generations
dtype: 'null'
splits:
- name: train
num_bytes: 27489773
num_examples: 10331
download_size: 8341186
dataset_size: 27489773
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
fathyshalab/massive_audio | ---
dataset_info:
features:
- name: id
dtype: string
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 12795
num_examples: 290
- name: validation
num_bytes: 1499
num_examples: 35
- name: test
num_bytes: 2615
num_examples: 62
download_size: 12219
dataset_size: 16909
---
# Dataset Card for "massive_audio"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlFrauch/math_wolfram_sciformer_tokenizer | ---
dataset_info:
features:
- name: tokens
sequence:
sequence: int64
splits:
- name: train
num_bytes: 3721244000
num_examples: 4605500
download_size: 151837798
dataset_size: 3721244000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Rakshit122/za1aaaa11 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: string
splits:
- name: train
num_bytes: 46270
num_examples: 226
download_size: 16707
dataset_size: 46270
---
# Dataset Card for "za1aaaa11"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_nbeerbower__flammen11X-mistral-7B | ---
pretty_name: Evaluation run of nbeerbower/flammen11X-mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nbeerbower/flammen11X-mistral-7B](https://huggingface.co/nbeerbower/flammen11X-mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nbeerbower__flammen11X-mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-24T20:46:02.580138](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__flammen11X-mistral-7B/blob/main/results_2024-03-24T20-46-02.580138.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6529584732368889,\n\
\ \"acc_stderr\": 0.032100034768875296,\n \"acc_norm\": 0.6531920184635167,\n\
\ \"acc_norm_stderr\": 0.03275827192418312,\n \"mc1\": 0.5361077111383109,\n\
\ \"mc1_stderr\": 0.017457800422268622,\n \"mc2\": 0.70183813827314,\n\
\ \"mc2_stderr\": 0.014808801040998189\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6928327645051194,\n \"acc_stderr\": 0.013481034054980943,\n\
\ \"acc_norm\": 0.71160409556314,\n \"acc_norm_stderr\": 0.013238394422428175\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7135032861979685,\n\
\ \"acc_stderr\": 0.0045120024597579585,\n \"acc_norm\": 0.8822943636725752,\n\
\ \"acc_norm_stderr\": 0.0032160063577603803\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"\
acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465066,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465066\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163248,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163248\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834834,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834834\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n\
\ \"acc_stderr\": 0.01654788799741611,\n \"acc_norm\": 0.42793296089385474,\n\
\ \"acc_norm_stderr\": 0.01654788799741611\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"\
acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n\
\ \"acc_stderr\": 0.01274724896707907,\n \"acc_norm\": 0.470013037809648,\n\
\ \"acc_norm_stderr\": 0.01274724896707907\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5361077111383109,\n\
\ \"mc1_stderr\": 0.017457800422268622,\n \"mc2\": 0.70183813827314,\n\
\ \"mc2_stderr\": 0.014808801040998189\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8216258879242304,\n \"acc_stderr\": 0.01075935201485594\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6580742987111448,\n \
\ \"acc_stderr\": 0.013066089625182813\n }\n}\n```"
repo_url: https://huggingface.co/nbeerbower/flammen11X-mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|arc:challenge|25_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|gsm8k|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hellaswag|10_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T20-46-02.580138.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T20-46-02.580138.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- '**/details_harness|winogrande|5_2024-03-24T20-46-02.580138.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-24T20-46-02.580138.parquet'
- config_name: results
data_files:
- split: 2024_03_24T20_46_02.580138
path:
- results_2024-03-24T20-46-02.580138.parquet
- split: latest
path:
- results_2024-03-24T20-46-02.580138.parquet
---
# Dataset Card for Evaluation run of nbeerbower/flammen11X-mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nbeerbower/flammen11X-mistral-7B](https://huggingface.co/nbeerbower/flammen11X-mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nbeerbower__flammen11X-mistral-7B",
"harness_winogrande_5",
split="train")
```
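Each configuration name in the YAML header above is derived mechanically from the harness task name: `|`, `:`, and `-` all become `_` (e.g. `harness|hendrycksTest-anatomy|5` becomes `harness_hendrycksTest_anatomy_5`). A small helper (hypothetical, for illustration; the naming convention is inferred from the config list above) can build the config name for any task:

```python
def task_to_config(task: str) -> str:
    """Map a harness task name to its dataset config name.

    Inferred convention: '|', ':' and '-' in the task name
    are all replaced by '_' in the config name.
    """
    return task.replace("|", "_").replace(":", "_").replace("-", "_")


# e.g. task_to_config("harness|winogrande|5") -> "harness_winogrande_5"
```

This is convenient when iterating over the per-task results programmatically instead of hard-coding each config name.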
## Latest results
These are the [latest results from run 2024-03-24T20:46:02.580138](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__flammen11X-mistral-7B/blob/main/results_2024-03-24T20-46-02.580138.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6529584732368889,
"acc_stderr": 0.032100034768875296,
"acc_norm": 0.6531920184635167,
"acc_norm_stderr": 0.03275827192418312,
"mc1": 0.5361077111383109,
"mc1_stderr": 0.017457800422268622,
"mc2": 0.70183813827314,
"mc2_stderr": 0.014808801040998189
},
"harness|arc:challenge|25": {
"acc": 0.6928327645051194,
"acc_stderr": 0.013481034054980943,
"acc_norm": 0.71160409556314,
"acc_norm_stderr": 0.013238394422428175
},
"harness|hellaswag|10": {
"acc": 0.7135032861979685,
"acc_stderr": 0.0045120024597579585,
"acc_norm": 0.8822943636725752,
"acc_norm_stderr": 0.0032160063577603803
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465066,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465066
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163248,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163248
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179326,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834834,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834834
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42793296089385474,
"acc_stderr": 0.01654788799741611,
"acc_norm": 0.42793296089385474,
"acc_norm_stderr": 0.01654788799741611
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.01274724896707907,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.01274724896707907
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5361077111383109,
"mc1_stderr": 0.017457800422268622,
"mc2": 0.70183813827314,
"mc2_stderr": 0.014808801040998189
},
"harness|winogrande|5": {
"acc": 0.8216258879242304,
"acc_stderr": 0.01075935201485594
},
"harness|gsm8k|5": {
"acc": 0.6580742987111448,
"acc_stderr": 0.013066089625182813
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lonestar108/sadness | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validate
path: data/validate-*
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 7274
num_examples: 23
- name: test
num_bytes: 3112
num_examples: 9
- name: validate
num_bytes: 733
num_examples: 3
download_size: 13174
dataset_size: 11119
---
# Dataset Card for "new_sadness"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CAiRE/prosocial-dialog-ind_Latn | ---
dataset_info:
features:
- name: context
dtype: string
- name: response
dtype: string
- name: rots
sequence: string
- name: safety_label
dtype: string
- name: safety_annotations
sequence: string
- name: safety_annotation_reasons
sequence: string
- name: source
dtype: string
- name: etc
dtype: string
- name: dialogue_id
dtype: int64
- name: response_id
dtype: int64
- name: episode_done
dtype: bool
- name: mt_context
dtype: string
splits:
- name: train
num_bytes: 78260432
num_examples: 120236
- name: validation
num_bytes: 13290287
num_examples: 20416
- name: test
num_bytes: 16250460
num_examples: 25029
download_size: 49405775
dataset_size: 107801179
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Tonic/EasyReddit | ---
license: mit
language:
- en
tags:
- not-for-all-audiences
- chemistry
- biology
- finance
- legal
- music
- art
- code
- climate
- medical
pretty_name: Easy Reddit
size_categories:
- 10M<n<100M
configs:
- config_name: shards
data_files:
- split: train
path:
- shard_1.jsonl
- shard_2.jsonl
- shard_3.jsonl
- shard_4.jsonl
- shard_5.jsonl
- shard_6.jsonl
- shard_7.jsonl
- shard_8.jsonl
- shard_9.jsonl
- shard_10.jsonl
- shard_11.jsonl
- shard_12.jsonl
- shard_13.jsonl
- shard_14.jsonl
- shard_15.jsonl
- shard_16.jsonl
- shard_17.jsonl
- shard_18.jsonl
- shard_19.jsonl
- shard_20.jsonl
- shard_21.jsonl
- shard_22.jsonl
- shard_23.jsonl
- shard_24.jsonl
- shard_25.jsonl
- shard_26.jsonl
- shard_27.jsonl
- shard_28.jsonl
- shard_29.jsonl
- shard_30.jsonl
- shard_31.jsonl
- shard_32.jsonl
- shard_33.jsonl
- shard_34.jsonl
---
# 🙋🏻♂️Welcome to 🧑🏻🚀Tonic's🚀🚰Easy🔴Reddit🔥!

This is every "best" answer from reddit_question_best_answers, appended together and reformatted according to the following template:
```json
{"prompt": "This is the first prompt", "completion": "This is the first completion"}
{"prompt": "This is the second prompt", "completion": "This is the second completion"}
```
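Because each line is an independent JSON object, a shard can be streamed one record at a time with the standard library (a minimal sketch using the two template records above):

```python
import io
import json

# Two records in the prompt/completion JSONL schema shown above
shard = io.StringIO(
    '{"prompt": "This is the first prompt", "completion": "This is the first completion"}\n'
    '{"prompt": "This is the second prompt", "completion": "This is the second completion"}\n'
)

# Each line is a standalone JSON object, so parse line by line
records = [json.loads(line) for line in shard]
print(records[0]["prompt"])  # -> This is the first prompt
```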

- 🌟 You can use it in shards or all together!
- 🌟 This dataset is **internally consistent**!
🤔 The point is to make it easy to train models with a single, correctly formatted dataset of
- **54,367,153 rows**
# Original Dataset :
[nreimers/reddit_question_best_answers](https://huggingface.co/datasets/nreimers/reddit_question_best_answers)
# How To Use
Combine random shards in random quantities to produce a high-quality conversational training dataset for fine-tuning, or combine rows line by line to save memory by running the following code:
```python
# see selectbyline.py
import os
import random
# Directory containing the shard JSONL files
shard_directory = "/path/to/shard/directory"
# Get a list of all JSONL files in the directory
shard_files = [f for f in os.listdir(shard_directory) if f.endswith('.jsonl')]
# Function to read a random number of lines (between min_lines and max_lines) from a file
def read_random_lines(filename, min_lines, max_lines):
selected_lines = []
num_lines = random.randint(min_lines, max_lines)
with open(filename, 'r') as file:
lines = list(file)
if len(lines) <= num_lines:
return lines
selected_lines = random.sample(lines, num_lines)
return selected_lines
# Function to combine shards
def combine_shards(output_filename, num_combinations):
with open(output_filename, 'w') as output_file:
for _ in range(num_combinations):
selected_shard_file = random.choice(shard_files)
lines = read_random_lines(os.path.join(shard_directory, selected_shard_file), 5000, 10000)
output_file.writelines(lines)
# Example usage
combine_shards("/path/to/output/combined_shards.jsonl", 10)
```
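After combining, a quick sanity check can confirm that every line of the output still matches the prompt/completion schema (a hypothetical helper, not part of the repo's tooling):

```python
import json

def validate_jsonl(lines):
    """Return indices of lines that are not JSON objects with exactly 'prompt' and 'completion' keys."""
    bad = []
    for i, line in enumerate(lines):
        try:
            record = json.loads(line)
        except json.JSONDecodeError:
            bad.append(i)
            continue
        if not isinstance(record, dict) or set(record) != {"prompt", "completion"}:
            bad.append(i)
    return bad

# In practice you would pass open("combined_shards.jsonl"); a small in-memory sample:
sample = [
    '{"prompt": "q1", "completion": "a1"}',
    '{"prompt": "q2", "completion": "a2"}',
    'not json at all',
]
print(validate_jsonl(sample))  # -> [2]
```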
# Pre-Processing
```python
import json
import os
import gzip
import logging
import re
import random
# Setup basic logging
logging.basicConfig(level=logging.INFO, format="%(asctime)s - %(levelname)s - %(message)s")
def clean_string(s):
"""Remove special characters, keeping only alphanumeric characters and spaces."""
if isinstance(s, list):
# Extract text from each dictionary in the list and join into a single string
s = " ".join([d.get("body", "") if isinstance(d, dict) else str(d) for d in s])
return re.sub(r'[^A-Za-z0-9 ]+', '', s)
def process_file(input_file, output_file):
try:
dataset = []
with gzip.open(input_file, 'rt') as infile:
for line in infile:
# Parse the JSON line
try:
data = json.loads(line)
except json.JSONDecodeError:
logging.error(f"Invalid JSON format in {input_file}: {line}")
continue
# Extract and clean the 'body' and 'answers' fields
prompt = clean_string(data.get("body", ""))
completion = clean_string(data.get("answers", ""))
# For each body found, make a new row and duplicate the prompt for it
if isinstance(data.get("body", ""), list):
for body in data.get("body", []):
cleaned_body = clean_string(body)
dataset.append({"prompt": cleaned_body, "completion": completion})
else:
dataset.append({"prompt": prompt, "completion": completion})
# Shuffle the dataset
random.shuffle(dataset)
# Write the shuffled dataset to the output file
with open(output_file, 'a') as outfile:
for item in dataset:
json.dump(item, outfile)
outfile.write('\n')
logging.info(f"Processed file: {input_file}")
except Exception as e:
logging.error(f"Error processing file {input_file}: {e}")
def process_files(file_list, output_dir):
# Ensure the output directory exists
if not os.path.exists(output_dir):
os.makedirs(output_dir)
# Create a single output file path
output_file = os.path.join(output_dir, 'synthesized_dataset.jsonl')
for input_file in file_list:
process_file(input_file, output_file)
# Update with your list of .gz file paths
file_list = [r'C:\Users\MeMyself\FILES', r'C:\Users\MeMyself\FILES']  # Update with your list of .gz file paths
output_dir = r'C:\Users\MeMyself\reddit_question_best_answers\processed'
process_files(file_list, output_dir)
```
#### **sharding script** :
```python
import json
import os
def read_dataset(file_path):
try:
with open(file_path, 'r') as file:
data = [json.loads(line) for line in file]
print(f"Dataset loaded successfully from {file_path}.")
return data
except Exception as e:
print(f"Error reading dataset from {file_path}: {e}")
return []
def shard_dataset(dataset, num_shards):
shard_size = len(dataset) // num_shards
shards = [dataset[i:i + shard_size] for i in range(0, len(dataset), shard_size)]
if len(shards) > num_shards:
shards[num_shards - 1].extend(shards.pop())
print(f"Dataset sharded into {num_shards} parts.")
return shards
def write_shards(shards, output_dir):
if not os.path.exists(output_dir):
os.makedirs(output_dir)
print(f"Created output directory at {output_dir}.")
for i, shard in enumerate(shards):
shard_file = os.path.join(output_dir, f'shard_{i+1}.jsonl')
with open(shard_file, 'w') as file:
for item in shard:
json.dump(item, file)
file.write('\n')
print(f"Shard {i+1} written to {shard_file}.")
def main():
input_file = 'path_to_processed_dataset.jsonl' # Update with your processed dataset file path
output_dir = 'sharded_dataset' # Update with your output directory for shards
num_shards = 33
dataset = read_dataset(input_file)
if dataset:
shards = shard_dataset(dataset, num_shards)
write_shards(shards, output_dir)
print("All shards have been successfully written.")
else:
print("No dataset to process.")
if __name__ == "__main__":
main()
```
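Note how `shard_dataset` handles the remainder: the slicing produces one extra undersized chunk, which is folded into the last shard. The effect on shard sizes can be seen with a small mirror of that logic (illustrative only, not part of the sharding script):

```python
def shard_sizes(n_items, num_shards):
    # Mirrors shard_dataset's slicing: fixed-size chunks, remainder folded into the last shard
    shard_size = n_items // num_shards
    sizes = [min(shard_size, n_items - i) for i in range(0, n_items, shard_size)]
    if len(sizes) > num_shards:
        sizes[num_shards - 1] += sizes.pop()
    return sizes

sizes = shard_sizes(100, 33)
print(len(sizes), sum(sizes), sizes[-1])  # -> 33 100 4
```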
### Disclaimer :
🌟Re-format this dataset before use.
🌟Probably there's a **big problem with the token count** on these long answers 😉
🌟**Good Luck !** 🧑🏻🚀🚀 |
FINNUMBER/FINCH_TRAIN_SA_FPB_100_NEWFORMAT | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: 'null'
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 125984
num_examples: 100
download_size: 62780
dataset_size: 125984
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hearmeneigh/e621-rising-v3-preliminary-data | ---
dataset_info:
pretty_name: 'E621 Rising V3: Preliminary Data'
viewer: false
tags:
- furry
- anthro
- nsfw
- e621
- not-for-all-audiences
---
# E621 Rising V3: Preliminary Data
Snapshot metadata from E621.net as of 2023-09-21
|
liuyanchen1015/MULTI_VALUE_mnli_here_come | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 39031
num_examples: 178
- name: dev_mismatched
num_bytes: 30888
num_examples: 146
- name: test_matched
num_bytes: 43474
num_examples: 184
- name: test_mismatched
num_bytes: 30680
num_examples: 134
- name: train
num_bytes: 1972131
num_examples: 8633
download_size: 1244138
dataset_size: 2116204
---
# Dataset Card for "MULTI_VALUE_mnli_here_come"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
exowanderer/wikidata_alljsons_21Feb2024 | ---
license: cc
---
|
johannes-garstenauer/structs_token_size_4_use_pd_True_full_amt_False_div_20_unskewed_decrease_True_factor_4 | ---
dataset_info:
features:
- name: struct
dtype: string
splits:
- name: train
num_bytes: 2713413
num_examples: 15091
download_size: 697188
dataset_size: 2713413
---
# Dataset Card for "structs_token_size_4_use_pd_True_full_amt_False_div_20_unskewed_decrease_True_factor_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wbxlala/fogother | ---
license: cc-by-4.0
---
|
bjoernp/the_stack_repo_languages | ---
dataset_info:
features:
- name: text_lang
dtype: string
- name: confidence
dtype: float64
- name: repo_name
dtype: string
splits:
- name: train
num_bytes: 1681449
num_examples: 35913
download_size: 0
dataset_size: 1681449
---
# Dataset Card for "the_stack_repo_languages"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pranjalipathre/multi2img | ---
dataset_info:
- config_name: video_00
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
- name: fr_image
dtype: image
- name: ls_image
dtype: image
- name: oh_image
dtype: image
- name: rs_image
dtype: image
splits:
- name: train
num_bytes: 4765012
num_examples: 4934
download_size: 2825400113
dataset_size: 4765012
- config_name: video_01
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
- name: fr_image
dtype: image
- name: ls_image
dtype: image
- name: oh_image
dtype: image
- name: rs_image
dtype: image
splits:
- name: train
num_bytes: 3965096
num_examples: 4093
download_size: 2675055227
dataset_size: 3965096
---
|
Amani27/massive_translation_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: "train.csv"
- split: validation
path: "validation.csv"
- split: test
path: "test.csv"
license: cc-by-4.0
task_categories:
- translation
language:
- en
- de
- es
- hi
- fr
- it
- ar
- nl
- ja
- pt
size_categories:
- 10K<n<100K
---
# Dataset Card for Massive Dataset for Translation
### Dataset Summary
This dataset is derived from AmazonScience/MASSIVE dataset for translation task purpose.
### Supported Tasks and Leaderboards
Translation
### Languages
1. English (en_US)
2. German (de_DE)
3. Hindi (hi_IN)
4. Spanish (es_ES)
5. French (fr_FR)
6. Italian (it_IT)
7. Arabic (ar_SA)
8. Dutch (nl_NL)
9. Japanese (ja_JP)
10. Portuguese (pt_PT)
|
giuliadc/mlsum-de-filtered | ---
task_categories:
- summarization
language:
- de
size_categories:
- 10K<n<100K
---
German MLSUM, filtered using the code by Aumiller et al. (1), available at https://github.com/dennlinger/summaries/tree/main
Filtering criteria:
- min_length_summary = 15; min_length_reference = 150; length_metric = "whitespace"
- min_compression_ratio = 2.5
- bi-gram overlap fraction between summary and original text < 0.79
- maximal article length = 512 tokens
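The whitespace length metric, compression ratio, and bi-gram overlap above can be illustrated with a few lines (an illustrative sketch; the actual filtering code is in the linked repository):

```python
def ws_len(text):
    # length_metric = "whitespace": token count by whitespace splitting
    return len(text.split())

def compression_ratio(reference, summary):
    return ws_len(reference) / ws_len(summary)

def bigram_overlap(summary, reference):
    # Fraction of the summary's bi-grams that also occur in the reference
    def bigrams(text):
        toks = text.split()
        return set(zip(toks, toks[1:]))
    s = bigrams(summary)
    return len(s & bigrams(reference)) / len(s) if s else 0.0

ref = "the quick brown fox jumps over the lazy dog near the river bank today"
summ = "the quick brown fox jumps"
print(compression_ratio(ref, summ))  # -> 2.8
print(bigram_overlap(summ, ref))     # -> 1.0
```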
(1): Aumiller, D., Fan, J., & Gertz, M. (2023). On the State of German (Abstractive) Text Summarization. arXiv preprint arXiv:2301.07095. |
yunus-emre/eval_history | ---
dataset_info:
features:
- name: E
dtype: string
- name: D
dtype: string
- name: question
dtype: string
- name: C
dtype: string
- name: A
dtype: string
- name: num
dtype: int64
- name: answers
dtype: string
- name: B
dtype: string
splits:
- name: test
num_bytes: 73203
num_examples: 209
download_size: 47951
dataset_size: 73203
configs:
- config_name: default
data_files:
- split: test
path: data/train-*
---
|
JaehyungKim/p2c_dynasent1 | ---
license: other
license_name: following-original-dataset
license_link: LICENSE
---
|
cuichenx/dummy-image-text-dataset | ---
license: apache-2.0
---
|
biznetgio/oasst2-balinese | ---
dataset_info:
features:
- name: message_id
dtype: string
- name: parent_id
dtype: string
- name: user_id
dtype: string
- name: created_date
dtype: string
- name: text
dtype: string
- name: role
dtype: string
- name: lang
dtype: string
- name: review_count
dtype: int64
- name: review_result
dtype: bool
- name: deleted
dtype: bool
- name: rank
dtype: float64
- name: synthetic
dtype: bool
- name: model_name
dtype: 'null'
- name: detoxify
struct:
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: obscene
dtype: float64
- name: severe_toxicity
dtype: float64
- name: sexual_explicit
dtype: float64
- name: threat
dtype: float64
- name: toxicity
dtype: float64
- name: message_tree_id
dtype: string
- name: tree_state
dtype: string
- name: emojis
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: labels
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: value
sequence: float64
splits:
- name: train
num_bytes: 41734658
num_examples: 39283
download_size: 13627462
dataset_size: 41734658
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nayohan/143_type_audio_correction | ---
dataset_info:
features:
- name: text
dtype: string
- name: err_sentence
dtype: string
- name: err_sentence_spell
dtype: string
splits:
- name: train
num_bytes: 93797713
num_examples: 243350
download_size: 33678450
dataset_size: 93797713
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
malaysia-ai/mosaic-starcoder-filtered | ---
language:
- ms
---
# Mosaic format for filtered starcoder dataset to train Malaysian LLM
This repository is to store dataset shards using mosaic format.
1. prepared at https://github.com/malaysia-ai/dedup-text-dataset/blob/main/pretrain-llm/combine-starcoder.ipynb
2. using tokenizer https://huggingface.co/malaysia-ai/bpe-tokenizer
3. 4096 context length.
## how-to
1. git clone,
```bash
git lfs clone https://huggingface.co/datasets/malaysia-ai/mosaic-starcoder-filtered
```
2. load it,
```python
from streaming import LocalDataset
import numpy as np
from streaming.base.format.mds.encodings import Encoding, _encodings
class UInt16(Encoding):
def encode(self, obj) -> bytes:
return obj.tobytes()
def decode(self, data: bytes):
return np.frombuffer(data, np.uint16)
_encodings['uint16'] = UInt16
dataset = LocalDataset('mosaic-starcoder-filtered')
len(dataset)
``` |
MartinMu/StandardDifusion | ---
license: openrail
---
|
dadinghh2/HumTrans | ---
license: cc-by-nc-4.0
---
# HumTrans Dataset
- Dataset Name: HumTrans
- Dataset Type: Humming audio in .wav format and corresponding label MIDI file
- Primary Use: Humming melody transcription and as a foundation for downstream tasks such as humming melody based music generation
- Summary: 500 musical compositions of different genres and languages, 1000 music segments in total; sampled at a frequency of 44,100 Hz; approximately 56.22 hours of audio; 14,614 files in total.
- File Description: all_wav.zip contains all the humming audio in .wav format; all_midi.zip contains the corresponding label MIDIs in .mid format. Both share the same naming convention, personID_musicID_segmentID_repetitionID or personID_musicID_segmentID_repetitionID_[U/D/DD/DDD], e.g. F01_0005_0001_1 or F04_0055_0001_2_DD. train_valid_test_keys.json contains the official train/valid/test split of this dataset. |
cmsolson75/LLM_artist_lyrics | ---
license: apache-2.0
---
|
Rootreck/so-vits-svc-4.0-ru-Warcraft_III_Reforged | ---
language:
- ru
---
This is the training data for the voice models of characters from "Warcraft III: Reforged", for so-vits-svc-4.1.1
|
KaiLv/UDR_Subj | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: label
dtype: int64
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 1181174
num_examples: 8000
- name: test
num_bytes: 299358
num_examples: 2000
- name: debug
num_bytes: 737874
num_examples: 5000
download_size: 1474560
dataset_size: 2218406
---
# Dataset Card for "UDR_Subj"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Dan1212121212/parsig_tokenizer | ---
license: apache-2.0
---
|
Sina-Alinejad-2002/divide_operation_prediction | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 78805
num_examples: 70
- name: validation
num_bytes: 5775
num_examples: 5
download_size: 73000
dataset_size: 84580
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
AdapterOcean/chemistry_dataset_standardized_unified | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 44945786
num_examples: 19999
download_size: 20574764
dataset_size: 44945786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "chemistry_dataset_standardized_unified"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
safikhan/alpaca-sample-1k | ---
license: cc-by-nc-4.0
---
Randomly sampled subset of Alpaca dataset for the LLM Workshop. |
Finnish-NLP/mc4_fi_cleaned | ---
annotations_creators: []
language_creators: []
language:
- fi
license: []
multilinguality:
- monolingual
size_categories:
- unknown
source_datasets:
- extended|mc4
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
pretty_name: mC4 Finnish Cleaned
---
# Dataset Card for mC4 Finnish Cleaned
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [Needs More Information]
- **Repository:** [Needs More Information]
- **Paper:** [Needs More Information]
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Needs More Information]
### Dataset Summary
mC4 Finnish cleaned is cleaned version of the original mC4 Finnish split.
### Supported Tasks and Leaderboards
mC4 Finnish is mainly intended to pretrain Finnish language models and word representations.
### Languages
Finnish
## Dataset Structure
### Data Instances
[Needs More Information]
### Data Fields
The data have several fields:
- url: url of the source as a string
- text: text content as a string
- timestamp: timestamp as a string
- perplexity_kenlm_full: perplexity of the text calculated by KenLM model
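Since every row carries `perplexity_kenlm_full`, that field can be used to filter the corpus further; a minimal sketch (the rows and the 500 cutoff are hypothetical):

```python
rows = [
    {"url": "https://a.fi", "text": "Hyvää päivää", "timestamp": "2020-01-01T00:00:00Z", "perplexity_kenlm_full": 120.5},
    {"url": "https://b.fi", "text": "asdf qwer zxcv", "timestamp": "2020-01-02T00:00:00Z", "perplexity_kenlm_full": 980.0},
]

# Keep rows whose KenLM perplexity is below a chosen (hypothetical) threshold
clean = [r for r in rows if r["perplexity_kenlm_full"] < 500]
print(len(clean))  # -> 1
```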
### Data Splits
- Train
- Validation
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
[Needs More Information] |
kyu-kang/dlforcv | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 67733111.0
num_examples: 107
download_size: 67131614
dataset_size: 67733111.0
---
# Dataset Card for "dlforcv"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DanielDimas/tt | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_stsb_if_would | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 5521
num_examples: 25
- name: test
num_bytes: 2441
num_examples: 14
- name: train
num_bytes: 9136
num_examples: 46
download_size: 20772
dataset_size: 17098
---
# Dataset Card for "MULTI_VALUE_stsb_if_would"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
phatle157/test_123 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_Novocoders__Lotus-7B | ---
pretty_name: Evaluation run of Novocoders/Lotus-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Novocoders/Lotus-7B](https://huggingface.co/Novocoders/Lotus-7B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Novocoders__Lotus-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T13:03:53.418083](https://huggingface.co/datasets/open-llm-leaderboard/details_Novocoders__Lotus-7B/blob/main/results_2024-02-09T13-03-53.418083.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6496423371761832,\n\
\ \"acc_stderr\": 0.03215279301715769,\n \"acc_norm\": 0.6501043548092335,\n\
\ \"acc_norm_stderr\": 0.03281746943202546,\n \"mc1\": 0.390452876376989,\n\
\ \"mc1_stderr\": 0.017078230743431448,\n \"mc2\": 0.5556599683913465,\n\
\ \"mc2_stderr\": 0.015308837108837361\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111728,\n\
\ \"acc_norm\": 0.6646757679180887,\n \"acc_norm_stderr\": 0.013796182947785562\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6554471220872337,\n\
\ \"acc_stderr\": 0.0047425103547779025,\n \"acc_norm\": 0.8480382393945429,\n\
\ \"acc_norm_stderr\": 0.003582501596564544\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.027134291628741702,\n\
\ \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.027134291628741702\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723292,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723292\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229876,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229876\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657262,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657262\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113114,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113114\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.030500283176545843,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.030500283176545843\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\
\ \"acc_stderr\": 0.013468201614066297,\n \"acc_norm\": 0.8288633461047255,\n\
\ \"acc_norm_stderr\": 0.013468201614066297\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508287,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40782122905027934,\n\
\ \"acc_stderr\": 0.016435865260914746,\n \"acc_norm\": 0.40782122905027934,\n\
\ \"acc_norm_stderr\": 0.016435865260914746\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\
\ \"acc_stderr\": 0.012741974333897226,\n \"acc_norm\": 0.4667535853976532,\n\
\ \"acc_norm_stderr\": 0.012741974333897226\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.02448448716291397,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.02448448716291397\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.390452876376989,\n\
\ \"mc1_stderr\": 0.017078230743431448,\n \"mc2\": 0.5556599683913465,\n\
\ \"mc2_stderr\": 0.015308837108837361\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8216258879242304,\n \"acc_stderr\": 0.010759352014855946\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6830932524639879,\n \
\ \"acc_stderr\": 0.012815868296721364\n }\n}\n```"
repo_url: https://huggingface.co/Novocoders/Lotus-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|arc:challenge|25_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|gsm8k|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hellaswag|10_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T13-03-53.418083.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T13-03-53.418083.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- '**/details_harness|winogrande|5_2024-02-09T13-03-53.418083.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T13-03-53.418083.parquet'
- config_name: results
data_files:
- split: 2024_02_09T13_03_53.418083
path:
- results_2024-02-09T13-03-53.418083.parquet
- split: latest
path:
- results_2024-02-09T13-03-53.418083.parquet
---
# Dataset Card for Evaluation run of Novocoders/Lotus-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Novocoders/Lotus-7B](https://huggingface.co/Novocoders/Lotus-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Novocoders__Lotus-7B",
"harness_winogrande_5",
split="train")
```
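Each run is stored under a timestamped split, with `latest` acting as an alias for the newest one. If you ever need to resolve the newest timestamp yourself (for example, when comparing several runs), a minimal sketch of parsing the split names could look like the following; `newest_split` is a hypothetical helper, not part of the `datasets` API:

```python
from datetime import datetime

def newest_split(split_names):
    """Pick the most recent timestamped split, ignoring the 'latest' alias."""
    stamps = [s for s in split_names if s != "latest"]
    # Split names follow the pattern 2024_02_09T13_03_53.418083
    return max(stamps, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

splits = ["2024_02_09T13_03_53.418083", "latest"]
print(newest_split(splits))  # -> 2024_02_09T13_03_53.418083
```

In practice, loading with `split="latest"` gives you the same data without any parsing.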
## Latest results
These are the [latest results from run 2024-02-09T13:03:53.418083](https://huggingface.co/datasets/open-llm-leaderboard/details_Novocoders__Lotus-7B/blob/main/results_2024-02-09T13-03-53.418083.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6496423371761832,
"acc_stderr": 0.03215279301715769,
"acc_norm": 0.6501043548092335,
"acc_norm_stderr": 0.03281746943202546,
"mc1": 0.390452876376989,
"mc1_stderr": 0.017078230743431448,
"mc2": 0.5556599683913465,
"mc2_stderr": 0.015308837108837361
},
"harness|arc:challenge|25": {
"acc": 0.6245733788395904,
"acc_stderr": 0.014150631435111728,
"acc_norm": 0.6646757679180887,
"acc_norm_stderr": 0.013796182947785562
},
"harness|hellaswag|10": {
"acc": 0.6554471220872337,
"acc_stderr": 0.0047425103547779025,
"acc_norm": 0.8480382393945429,
"acc_norm_stderr": 0.003582501596564544
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.027134291628741702,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.027134291628741702
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229876,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229876
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113114,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113114
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545843,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545843
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179326,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066297,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066297
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508287,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40782122905027934,
"acc_stderr": 0.016435865260914746,
"acc_norm": 0.40782122905027934,
"acc_norm_stderr": 0.016435865260914746
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897226,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897226
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0190709855896875,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0190709855896875
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.02448448716291397,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.02448448716291397
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.390452876376989,
"mc1_stderr": 0.017078230743431448,
"mc2": 0.5556599683913465,
"mc2_stderr": 0.015308837108837361
},
"harness|winogrande|5": {
"acc": 0.8216258879242304,
"acc_stderr": 0.010759352014855946
},
"harness|gsm8k|5": {
"acc": 0.6830932524639879,
"acc_stderr": 0.012815868296721364
}
}
```
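The top-level `all` entry aggregates the per-task scores. As a rough illustration only — a plain unweighted mean over a few of the tasks above, not necessarily the leaderboard's exact aggregation — the averaging can be sketched as:

```python
# A small subset of the per-task results shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.37},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6148148148148148},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7105263157894737},
}

# Unweighted mean accuracy across the selected tasks.
mean_acc = sum(v["acc"] for v in results.values()) / len(results)
print(round(mean_acc, 4))
```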
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
davidgaofc/RM_inout_bal | ---
dataset_info:
features:
- name: Text
dtype: string
- name: Label
dtype: int64
splits:
- name: train
num_bytes: 433846.3841463415
num_examples: 910
download_size: 193172
dataset_size: 433846.3841463415
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-lener_br-lener_br-f0f34b-15626154 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- lener_br
eval_info:
task: entity_extraction
model: Luciano/bertimbau-base-lener-br-finetuned-lener-br
metrics: []
dataset_name: lener_br
dataset_config: lener_br
dataset_split: train
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: Luciano/bertimbau-base-lener-br-finetuned-lener-br
* Dataset: lener_br
* Config: lener_br
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Luciano](https://huggingface.co/Luciano) for evaluating this model.
RicardoSolares/Messi2 | ---
license: apache-2.0
---
|
JyotiNayak/gpt4-pol-ideologies-small | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
size_categories:
- n<1K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset contains very short paragraphs (2-3 sentences) which are labelled as either 'liberal' or 'conservative'. It has been generated using GPT-4.
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
The dataset has been created using this prompt: 'Write a short paragraph (2-3 sentences) expressing a {label} political viewpoint.'
All the entries have also been manually checked to ensure that each paragraph accurately maps to its label. Note that the labels may not be representative of political discourse outside of the United States.
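As a hypothetical sketch of how such prompts could be assembled per label (the exact generation pipeline is not documented here, so variable names below are illustrative):

```python
labels = ["liberal", "conservative"]

# The prompt template quoted in the card above.
prompt_template = (
    "Write a short paragraph (2-3 sentences) expressing a {label} political viewpoint."
)

# One prompt per target label, ready to send to the generation model.
prompts = [prompt_template.format(label=label) for label in labels]
print(prompts[0])
```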
- **Curated by:** Jyoti S Nayak
- **Language(s) (NLP):** English
- **License:** Apache
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
This dataset can be a great starting point to train models to analyse political speeches and legal and political documents.
|
um005 | ---
annotations_creators:
- no-annotation
language_creators:
- other
language:
- en
- ur
license:
- unknown
multilinguality:
- multilingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- translation
task_ids: []
paperswithcode_id: umc005-english-urdu
pretty_name: UMC005 English-Urdu
dataset_info:
- config_name: bible
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- ur
- en
splits:
- name: train
num_bytes: 2350730
num_examples: 7400
- name: validation
num_bytes: 113476
num_examples: 300
- name: test
num_bytes: 104678
num_examples: 257
download_size: 3683565
dataset_size: 2568884
- config_name: quran
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- ur
- en
splits:
- name: train
num_bytes: 2929711
num_examples: 6000
- name: validation
num_bytes: 43499
num_examples: 214
- name: test
num_bytes: 44413
num_examples: 200
download_size: 3683565
dataset_size: 3017623
- config_name: all
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- ur
- en
splits:
- name: train
num_bytes: 5280441
num_examples: 13400
- name: validation
num_bytes: 156963
num_examples: 514
- name: test
num_bytes: 149079
num_examples: 457
download_size: 3683565
dataset_size: 5586483
---
# Dataset Card for UMC005 English-Urdu
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://ufal.ms.mff.cuni.cz/umc/005-en-ur/
- **Repository:** None
- **Paper:** https://www.researchgate.net/publication/268008206_Word-Order_Issues_in_English-to-Urdu_Statistical_Machine_Translation
- **Leaderboard:** [If the dataset supports an active leaderboard, add link here]()
- **Point of Contact:** Bushra Jawaid and Daniel Zeman
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@abhishekkrthakur](https://github.com/abhishekkrthakur) for adding this dataset. |
Stud1os/Daniel_Penin | ---
license: openrail
---
|
OmarN121/train | ---
YAML tags:
- copy-paste the tags obtained with the online tagging app: https://huggingface.co/spaces/huggingface/datasets-tagging
---
# Dataset Card for [Dataset Name]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. |
guoyu-zhang/usp_2 | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 496440
num_examples: 1000
download_size: 272907
dataset_size: 496440
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
darrel999/business-java-code | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 30579823
num_examples: 53183
download_size: 15957467
dataset_size: 30579823
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nyuuzyou/EMERCOM-questions | ---
annotations_creators:
- crowdsourced
language:
- ru
language_creators:
- crowdsourced
license:
- cc0-1.0
multilinguality:
- monolingual
pretty_name: psi.mchs.gov.ru Questions
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-generation
task_ids:
- language-modeling
---
# Dataset Card for psi.mchs.gov.ru Questions
### Dataset Summary
This dataset contains text-based consultations with EMERCOM, Russia's Emergency Psychological Assistance service, conducted through its [online web portal](https://psi.mchs.gov.ru). It includes the questions and concerns expressed by individuals seeking support, along with the guidance and advice provided by the service's psychologists. The dataset can be analyzed to understand the nature of anxieties faced by the public and the techniques employed by psychologists to offer support.
### Languages
The dataset is mostly in Russian, but there may be other languages present.
## Dataset Structure
### Data Fields
This dataset includes the following fields:
- `id`: ID of the question (integer)
- `question`:
- `author`: Name of the person asking the question (string)
- `title`: Title of the question (string)
- `description`: Description of the question (string)
- `answer`:
- `author`: Name of the person who answered (string)
- `content`: Text of the answer (string)
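To make the nesting concrete, a single record can be pictured as a plain Python dictionary following the fields above (a sketch only — the field names come from this card, while the values shown are invented for illustration):

```python
# Hypothetical record illustrating the schema above; field names follow
# the card, values are invented for demonstration.
record = {
    "id": 12345,
    "question": {
        "author": "Анна",
        "title": "Тревога перед экзаменом",
        "description": "Не могу справиться с волнением перед сессией.",
    },
    "answer": {
        "author": "Психолог службы",
        "content": "Попробуйте дыхательные упражнения и режим сна.",
    },
}

# Nested fields are reached with ordinary dictionary access.
print(record["question"]["title"])  # Тревога перед экзаменом
```

When the dataset is loaded with the `datasets` library, each example exposes the same nesting, so e.g. `example["answer"]["content"]` yields the psychologist's reply.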
### Data Splits
All examples are in the train split; there is no validation split.
## Additional Information
### License
This dataset is dedicated to the public domain under the Creative Commons Zero (CC0) license. This means you can:
* Use it for any purpose, including commercial projects.
* Modify it however you like.
* Distribute it without asking permission.
No attribution is required, but it's always appreciated!
CC0 license: https://creativecommons.org/publicdomain/zero/1.0/deed.en
To learn more about CC0, visit the Creative Commons website: https://creativecommons.org/publicdomain/zero/1.0/
### Dataset Curators
- [nyuuzyou](https://ducks.party)
|
open-llm-leaderboard/details_Aspik101__30B-Lazarus-instruct-PL-lora_unload | ---
pretty_name: Evaluation run of Aspik101/30B-Lazarus-instruct-PL-lora_unload
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Aspik101/30B-Lazarus-instruct-PL-lora_unload](https://huggingface.co/Aspik101/30B-Lazarus-instruct-PL-lora_unload)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aspik101__30B-Lazarus-instruct-PL-lora_unload\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T08:13:24.195120](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__30B-Lazarus-instruct-PL-lora_unload/blob/main/results_2023-10-17T08-13-24.195120.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.01164010067114094,\n\
\ \"em_stderr\": 0.0010984380734032925,\n \"f1\": 0.07800545302013438,\n\
\ \"f1_stderr\": 0.0017935902090569574,\n \"acc\": 0.4522835158298991,\n\
\ \"acc_stderr\": 0.010087630088457804\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.01164010067114094,\n \"em_stderr\": 0.0010984380734032925,\n\
\ \"f1\": 0.07800545302013438,\n \"f1_stderr\": 0.0017935902090569574\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11372251705837756,\n \
\ \"acc_stderr\": 0.008744810131034036\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7908445146014207,\n \"acc_stderr\": 0.01143045004588157\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Aspik101/30B-Lazarus-instruct-PL-lora_unload
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T08_13_24.195120
path:
- '**/details_harness|drop|3_2023-10-17T08-13-24.195120.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T08-13-24.195120.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T08_13_24.195120
path:
- '**/details_harness|gsm8k|5_2023-10-17T08-13-24.195120.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T08-13-24.195120.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T08_13_24.195120
path:
- '**/details_harness|winogrande|5_2023-10-17T08-13-24.195120.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T08-13-24.195120.parquet'
- config_name: results
data_files:
- split: 2023_10_17T08_13_24.195120
path:
- results_2023-10-17T08-13-24.195120.parquet
- split: latest
path:
- results_2023-10-17T08-13-24.195120.parquet
---
# Dataset Card for Evaluation run of Aspik101/30B-Lazarus-instruct-PL-lora_unload
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Aspik101/30B-Lazarus-instruct-PL-lora_unload
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Aspik101/30B-Lazarus-instruct-PL-lora_unload](https://huggingface.co/Aspik101/30B-Lazarus-instruct-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aspik101__30B-Lazarus-instruct-PL-lora_unload",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-17T08:13:24.195120](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__30B-Lazarus-instruct-PL-lora_unload/blob/main/results_2023-10-17T08-13-24.195120.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.01164010067114094,
"em_stderr": 0.0010984380734032925,
"f1": 0.07800545302013438,
"f1_stderr": 0.0017935902090569574,
"acc": 0.4522835158298991,
"acc_stderr": 0.010087630088457804
},
"harness|drop|3": {
"em": 0.01164010067114094,
"em_stderr": 0.0010984380734032925,
"f1": 0.07800545302013438,
"f1_stderr": 0.0017935902090569574
},
"harness|gsm8k|5": {
"acc": 0.11372251705837756,
"acc_stderr": 0.008744810131034036
},
"harness|winogrande|5": {
"acc": 0.7908445146014207,
"acc_stderr": 0.01143045004588157
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta | ---
pretty_name: Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta](https://huggingface.co/ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-13T07:27:37.172195](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta/blob/main/results_2024-02-13T07-27-37.172195.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.595356587300876,\n\
\ \"acc_stderr\": 0.03311822764879789,\n \"acc_norm\": 0.6057673454107737,\n\
\ \"acc_norm_stderr\": 0.0339087917676742,\n \"mc1\": 0.33414932680538556,\n\
\ \"mc1_stderr\": 0.016512530677150535,\n \"mc2\": 0.4975919881917549,\n\
\ \"mc2_stderr\": 0.01579574647682552\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5554607508532423,\n \"acc_stderr\": 0.014521226405627079,\n\
\ \"acc_norm\": 0.5930034129692833,\n \"acc_norm_stderr\": 0.014356399418009128\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6152160924118701,\n\
\ \"acc_stderr\": 0.004855498343308391,\n \"acc_norm\": 0.8133837880900219,\n\
\ \"acc_norm_stderr\": 0.0038880689432920727\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n\
\ \"acc_stderr\": 0.04304979692464241,\n \"acc_norm\": 0.5407407407407407,\n\
\ \"acc_norm_stderr\": 0.04304979692464241\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6528301886792452,\n \"acc_stderr\": 0.02930010170554965,\n\
\ \"acc_norm\": 0.6528301886792452,\n \"acc_norm_stderr\": 0.02930010170554965\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033583,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033583\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.02497695405315525,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.02497695405315525\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.031544498882702866,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.031544498882702866\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.02717121368316453,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.02717121368316453\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n\
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8055045871559633,\n \"acc_stderr\": 0.01697028909045803,\n \"\
acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.01697028909045803\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.02931281415395593,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.02931281415395593\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.02126271940040696,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.02126271940040696\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\
\ \"acc_stderr\": 0.014805384478371155,\n \"acc_norm\": 0.7803320561941252,\n\
\ \"acc_norm_stderr\": 0.014805384478371155\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.025131000233647907,\n\
\ \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.025131000233647907\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35083798882681566,\n\
\ \"acc_stderr\": 0.01596103667523096,\n \"acc_norm\": 0.35083798882681566,\n\
\ \"acc_norm_stderr\": 0.01596103667523096\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.026173908506718576,\n\
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.026173908506718576\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n\
\ \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n\
\ \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409818,\n\
\ \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409818\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.02960991207559411,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.02960991207559411\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41199478487614083,\n\
\ \"acc_stderr\": 0.012570871032146073,\n \"acc_norm\": 0.41199478487614083,\n\
\ \"acc_norm_stderr\": 0.012570871032146073\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5996732026143791,\n \"acc_stderr\": 0.019821843688271768,\n \
\ \"acc_norm\": 0.5996732026143791,\n \"acc_norm_stderr\": 0.019821843688271768\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.030713560455108493,\n\
\ \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.030713560455108493\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.029475250236017197,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.029475250236017197\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33414932680538556,\n\
\ \"mc1_stderr\": 0.016512530677150535,\n \"mc2\": 0.4975919881917549,\n\
\ \"mc2_stderr\": 0.01579574647682552\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.755327545382794,\n \"acc_stderr\": 0.012082125654159738\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05686125852918878,\n \
\ \"acc_stderr\": 0.006378790242099631\n }\n}\n```"
repo_url: https://huggingface.co/ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|arc:challenge|25_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|gsm8k|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hellaswag|10_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T07-27-37.172195.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T07-27-37.172195.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- '**/details_harness|winogrande|5_2024-02-13T07-27-37.172195.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-13T07-27-37.172195.parquet'
- config_name: results
data_files:
- split: 2024_02_13T07_27_37.172195
path:
- results_2024-02-13T07-27-37.172195.parquet
- split: latest
path:
- results_2024-02-13T07-27-37.172195.parquet
---
# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta](https://huggingface.co/ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta",
"harness_winogrande_5",
split="train")
```
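The per-task config names listed in this card's `configs` section follow a predictable scheme: `harness_<task>_<num_fewshot>`, with any `-` or `:` in the harness task name replaced by `_` (e.g. `truthfulqa:mc` at 0 shots becomes `harness_truthfulqa_mc_0`). A small helper — hypothetical, not part of any library — can build the config name to pass to `load_dataset`:

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    """Build a config name matching this card's `configs` section.

    Replaces the '-' and ':' separators used in harness task names
    with underscores and appends the few-shot count, e.g.
    ("hendrycksTest-virology", 5) -> "harness_hendrycksTest_virology_5".
    """
    normalized = task.replace("-", "_").replace(":", "_")
    return f"harness_{normalized}_{num_fewshot}"
```

For example, `harness_config_name("arc:challenge", 25)` yields `"harness_arc_challenge_25"`, which matches the first config listed above.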
## Latest results
These are the [latest results from run 2024-02-13T07:27:37.172195](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta/blob/main/results_2024-02-13T07-27-37.172195.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.595356587300876,
"acc_stderr": 0.03311822764879789,
"acc_norm": 0.6057673454107737,
"acc_norm_stderr": 0.0339087917676742,
"mc1": 0.33414932680538556,
"mc1_stderr": 0.016512530677150535,
"mc2": 0.4975919881917549,
"mc2_stderr": 0.01579574647682552
},
"harness|arc:challenge|25": {
"acc": 0.5554607508532423,
"acc_stderr": 0.014521226405627079,
"acc_norm": 0.5930034129692833,
"acc_norm_stderr": 0.014356399418009128
},
"harness|hellaswag|10": {
"acc": 0.6152160924118701,
"acc_stderr": 0.004855498343308391,
"acc_norm": 0.8133837880900219,
"acc_norm_stderr": 0.0038880689432920727
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464241,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464241
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6528301886792452,
"acc_stderr": 0.02930010170554965,
"acc_norm": 0.6528301886792452,
"acc_norm_stderr": 0.02930010170554965
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033583,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033583
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.02497695405315525,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.02497695405315525
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.031544498882702866,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.031544498882702866
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.02717121368316453,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.02717121368316453
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8055045871559633,
"acc_stderr": 0.01697028909045803,
"acc_norm": 0.8055045871559633,
"acc_norm_stderr": 0.01697028909045803
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.02931281415395593,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.02931281415395593
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.02126271940040696,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.02126271940040696
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.014805384478371155,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.014805384478371155
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6791907514450867,
"acc_stderr": 0.025131000233647907,
"acc_norm": 0.6791907514450867,
"acc_norm_stderr": 0.025131000233647907
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35083798882681566,
"acc_stderr": 0.01596103667523096,
"acc_norm": 0.35083798882681566,
"acc_norm_stderr": 0.01596103667523096
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.026173908506718576,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.026173908506718576
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409818,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409818
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.02960991207559411,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.02960991207559411
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41199478487614083,
"acc_stderr": 0.012570871032146073,
"acc_norm": 0.41199478487614083,
"acc_norm_stderr": 0.012570871032146073
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5996732026143791,
"acc_stderr": 0.019821843688271768,
"acc_norm": 0.5996732026143791,
"acc_norm_stderr": 0.019821843688271768
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017197,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017197
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33414932680538556,
"mc1_stderr": 0.016512530677150535,
"mc2": 0.4975919881917549,
"mc2_stderr": 0.01579574647682552
},
"harness|winogrande|5": {
"acc": 0.755327545382794,
"acc_stderr": 0.012082125654159738
},
"harness|gsm8k|5": {
"acc": 0.05686125852918878,
"acc_stderr": 0.006378790242099631
}
}
```
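The per-task scores in the JSON above can be post-processed locally, for instance to rank the MMLU subtasks by normalized accuracy. A minimal sketch (the `results` dict below is a small hand-copied excerpt of the JSON above; the full file has one entry per `harness|hendrycksTest-*` task):

```python
# Rank MMLU subtasks by acc_norm from a results dict like the one above.
results = {
    "harness|hendrycksTest-marketing|5": {"acc_norm": 0.8803418803418803},
    "harness|hendrycksTest-world_religions|5": {"acc_norm": 0.8304093567251462},
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.27},
}

# Keep only the hendrycksTest (MMLU) tasks and strip the harness prefix.
mmlu = {
    task.split("|")[1].removeprefix("hendrycksTest-"): scores["acc_norm"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}
ranked = sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)
for name, score in ranked:
    print(f"{name}: {score:.3f}")
```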
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
sawradip/bn_trans_data | ---
dataset_info:
features:
- name: bn
dtype: string
- name: en
dtype: string
- name: merged
dtype: string
- name: length
dtype: int64
splits:
- name: train
num_bytes: 1305725945
num_examples: 2379749
- name: test
num_bytes: 792778
num_examples: 1000
- name: validation
num_bytes: 487273
num_examples: 597
download_size: 721412382
dataset_size: 1307005996
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
DTU54DL/common3k-train-prepared | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2881415928
num_examples: 3000
download_size: 493426586
dataset_size: 2881415928
---
# Dataset Card for "common3k-train-prepared"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hojzas/autotrain-data-test | ---
license: apache-2.0
dataset_info:
features:
- name: autotrain_text
dtype: string
- name: autotrain_label
dtype:
class_label:
names:
'0': negative
'1': positive
splits:
- name: train
num_bytes: 167
num_examples: 6
- name: validation
num_bytes: 52
num_examples: 2
download_size: 3117
dataset_size: 219
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
nuprl/EditPackFT-Multi | ---
dataset_info:
features:
- name: commit
dtype: string
- name: old_file
dtype: string
- name: new_file
dtype: string
- name: old_contents
dtype: string
- name: new_contents
dtype: string
- name: subject
dtype: string
- name: message
dtype: string
- name: lang
dtype: string
- name: license
dtype: string
- name: repos
dtype: string
- name: config
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 1122708281.0324206
num_examples: 306133
download_size: 514249902
dataset_size: 1122708281.0324206
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
A multilingual version of https://huggingface.co/datasets/nuprl/EditPackFT
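Each record pairs `old_contents` with `new_contents`, so edits can be rendered as diffs. A minimal sketch, using the field names from the schema above; the record here is a made-up toy example, not an actual row from the dataset:

```python
import difflib

# Toy record following the dataset schema (old_contents -> new_contents).
record = {
    "lang": "Python",
    "subject": "Fix off-by-one in range",
    "old_contents": "for i in range(1, n):\n    print(i)\n",
    "new_contents": "for i in range(n):\n    print(i)\n",
}

# Render the edit as a unified diff.
diff = "".join(
    difflib.unified_diff(
        record["old_contents"].splitlines(keepends=True),
        record["new_contents"].splitlines(keepends=True),
        fromfile="old_file",
        tofile="new_file",
    )
)
print(diff)
```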
## Citation
If you use our work, please cite our paper as follows:
```
@misc{cassano2023edit,
title={Can It Edit? Evaluating the Ability of Large Language Models to Follow Code Editing Instructions},
author={Federico Cassano and Luisa Li and Akul Sethi and Noah Shinn and Abby Brennan-Jones and Anton Lozhkov and Carolyn Jane Anderson and Arjun Guha},
year={2023},
eprint={2312.12450},
archivePrefix={arXiv},
primaryClass={cs.SE}
}
``` |
IWSLT/da2023 | ---
license: cc-by-nc-nd-4.0
---
|
Valarmathy/cricket_indvspak | ---
license: cc0-1.0
configs:
- config_name: Valarmathy--cricket_indvspak
task_categories:
- table-question-answering
- tabular-classification
size_categories:
- 1K<n<10K
--- |
Azure99/blossom-math-v1 | ---
license: apache-2.0
task_categories:
- text-generation
- text2text-generation
language:
- zh
size_categories:
- 10K<n<100K
---
# BLOSSOM MATH V1
### Introduction
[Blossom Math V3](https://huggingface.co/datasets/Azure99/blossom-math-v3) has been released! 🤗
Blossom Math V1 is a Chinese math conversation dataset derived from Math23K, suitable for fine-tuning on math problems.
This dataset takes all of the questions from Math23K, calls gpt-3.5-turbo-0613 to generate solutions, and validates the generated solutions against the answers in the original dataset, filtering out incorrect ones. This largely guarantees the accuracy of both questions and answers.
This release contains 50% of the full data, comprising 10K records.
### Language
Chinese
### Dataset Structure
Each record represents a complete question and its answer, with four fields: id, input, output, and answer.
- id: string, the question id from Math23K.
- input: string, the question.
- output: string, the solution generated by gpt-3.5-turbo-0613.
- answer: string, the correct answer.
### Dataset Limitations
All responses in this dataset were generated by gpt-3.5-turbo-0613 and have undergone preliminary validation, but they may still contain inaccurate answers.
albertvillanova/dummy-version | ---
license: openrail
source_datasets:
- extended|go_emotions
---
# Dataset version 2
Work in progress...
|
CyberHarem/layla_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of layla (THE iDOLM@STER: Cinderella Girls)
This is the dataset of layla (THE iDOLM@STER: Cinderella Girls), containing 324 images and their tags.
The core tags of this character are `blonde_hair, long_hair, dark_skin, dark-skinned_female, aqua_eyes, blue_eyes, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 324 | 338.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/layla_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 324 | 221.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/layla_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 734 | 458.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/layla_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 324 | 311.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/layla_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 734 | 604.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/layla_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/layla_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, open_mouth, school_uniform, ribbon, simple_background, smile |
| 1 | 10 |  |  |  |  |  | 1girl, collared_shirt, neck_ribbon, open_mouth, white_shirt, open_jacket, red_ribbon, school_uniform, blue_jacket, solo, blazer, :d, pleated_skirt, upper_body, white_background, blush, long_sleeves, looking_at_viewer, parted_bangs, simple_background, very_long_hair |
| 2 | 6 |  |  |  |  |  | 1girl, bare_shoulders, solo, choker, detached_sleeves, looking_at_viewer, midriff, necklace, straight_hair, arabian_clothes, bracelet, hair_flower, nail_polish, navel, skirt, smile, closed_mouth, dancer, veil |
| 3 | 11 |  |  |  |  |  | 1girl, navel, solo, white_bikini, looking_at_viewer, small_breasts, blush, smile, micro_bikini, simple_background, white_background, open_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | blush | open_mouth | school_uniform | ribbon | simple_background | smile | collared_shirt | neck_ribbon | white_shirt | open_jacket | red_ribbon | blue_jacket | blazer | :d | pleated_skirt | upper_body | white_background | long_sleeves | parted_bangs | very_long_hair | bare_shoulders | choker | detached_sleeves | midriff | necklace | straight_hair | arabian_clothes | bracelet | hair_flower | nail_polish | navel | skirt | closed_mouth | dancer | veil | white_bikini | small_breasts | micro_bikini |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:-------------|:-----------------|:---------|:--------------------|:--------|:-----------------|:--------------|:--------------|:--------------|:-------------|:--------------|:---------|:-----|:----------------|:-------------|:-------------------|:---------------|:---------------|:-----------------|:-----------------|:---------|:-------------------|:----------|:-----------|:----------------|:------------------|:-----------|:--------------|:--------------|:--------|:--------|:---------------|:---------|:-------|:---------------|:----------------|:---------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | |
| 3 | 11 |  |  |  |  |  | X | X | X | X | X | | | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | X | X | X |
|
FirulAI/test_meters | ---
license: apache-2.0
configs:
- config_name: tab
data_files:
- split: train
path: "train.csv"
- split: test
path: "test.csv"
sep: "\t"
---
|
DBQ/Mr.Porter.Product.prices.Sweden | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Sweden - Mr Porter - Product-level price list
tags:
- webscraping
- ecommerce
- Mr Porter
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: int64
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 9100964
num_examples: 27695
download_size: 2082496
dataset_size: 9100964
---
# Mr Porter web scraped data
## About the website
The **EMEA region**, specifically **Sweden**, has seen a significant rise in the **luxury online retail industry**, where **Mr Porter** operates. The growth has primarily been driven by fast-paced digitalization, significant internet penetration, and a growing number of digitally native consumers. Additionally, Swedish consumers, renowned for their fashion-forward approach, have demonstrated a strong appetite for luxury fashion products, favouring the convenience, choice, and quality offered by online retailers like Mr Porter. This dataset provides comprehensive insight into **e-commerce product-list page (PLP) data** from Mr Porter's operations in Sweden, unveiling valuable customer trends and product preferences.
## Link to **dataset**
[Sweden - Mr Porter - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Mr%20Porter%20Product-prices%20Sweden/r/recKNdRjHlSCpbHYz)
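Given the schema above, discounted items can be selected via `flg_discount` and their discount rate derived from `full_price` and `price`. A minimal sketch; the rows below are made-up illustrations, not actual data from the dataset:

```python
# Made-up rows following the card's schema (full_price, price, flg_discount).
rows = [
    {"title": "Leather belt", "full_price": 100.0, "price": 80.0, "flg_discount": 1},
    {"title": "Wool scarf", "full_price": 50.0, "price": 50.0, "flg_discount": 0},
]

# Keep only items flagged as discounted and report their discount rate.
discounted = [r for r in rows if r["flg_discount"] == 1]
for r in discounted:
    rate = 1 - r["price"] / r["full_price"]
    print(f"{r['title']}: {rate:.0%} off")
```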
|
awettig/Pile-FreeLaw-0.5B-8K-opt | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 6500629798
num_examples: 61035
- name: test
num_bytes: 64969880
num_examples: 610
download_size: 1532312780
dataset_size: 6565599678
---
# Dataset Card for "Pile-FreeLaw-0.5B-8K-opt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Qwen__Qwen1.5-0.5B-Chat | ---
pretty_name: Evaluation run of Qwen/Qwen1.5-0.5B-Chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Qwen/Qwen1.5-0.5B-Chat](https://huggingface.co/Qwen/Qwen1.5-0.5B-Chat) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Qwen__Qwen1.5-0.5B-Chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-04T12:58:21.089741](https://huggingface.co/datasets/open-llm-leaderboard/details_Qwen__Qwen1.5-0.5B-Chat/blob/main/results_2024-03-04T12-58-21.089741.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3365001222738154,\n\
\ \"acc_stderr\": 0.03347053932587535,\n \"acc_norm\": 0.33938163452253794,\n\
\ \"acc_norm_stderr\": 0.03425564830442555,\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476206,\n \"mc2\": 0.4295422712870499,\n\
\ \"mc2_stderr\": 0.015116432520822828\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.26706484641638223,\n \"acc_stderr\": 0.012928933196496363,\n\
\ \"acc_norm\": 0.3054607508532423,\n \"acc_norm_stderr\": 0.013460080478002498\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3593905596494722,\n\
\ \"acc_stderr\": 0.004788412062375709,\n \"acc_norm\": 0.44074885480979886,\n\
\ \"acc_norm_stderr\": 0.00495462230873898\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.03785714465066654,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.03785714465066654\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.39,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3471698113207547,\n \"acc_stderr\": 0.029300101705549655,\n\
\ \"acc_norm\": 0.3471698113207547,\n \"acc_norm_stderr\": 0.029300101705549655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2916666666666667,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.2916666666666667,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.31213872832369943,\n\
\ \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.31213872832369943,\n\
\ \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179327,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179327\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.02802022627120022,\n\
\ \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.02802022627120022\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518754,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518754\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400175,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400175\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.03764950879790606,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.03764950879790606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.29354838709677417,\n \"acc_stderr\": 0.0259060870213193,\n \"\
acc_norm\": 0.29354838709677417,\n \"acc_norm_stderr\": 0.0259060870213193\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.031785297106427496,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.031785297106427496\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.47878787878787876,\n \"acc_stderr\": 0.03900828913737302,\n\
\ \"acc_norm\": 0.47878787878787876,\n \"acc_norm_stderr\": 0.03900828913737302\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.033586181457325226,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.033586181457325226\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3160621761658031,\n \"acc_stderr\": 0.03355397369686174,\n\
\ \"acc_norm\": 0.3160621761658031,\n \"acc_norm_stderr\": 0.03355397369686174\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2512820512820513,\n \"acc_stderr\": 0.021992016662370547,\n\
\ \"acc_norm\": 0.2512820512820513,\n \"acc_norm_stderr\": 0.021992016662370547\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31932773109243695,\n \"acc_stderr\": 0.030283995525884396,\n\
\ \"acc_norm\": 0.31932773109243695,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3706422018348624,\n \"acc_stderr\": 0.020707458164352988,\n \"\
acc_norm\": 0.3706422018348624,\n \"acc_norm_stderr\": 0.020707458164352988\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2638888888888889,\n \"acc_stderr\": 0.030058202704309846,\n \"\
acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.030058202704309846\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4117647058823529,\n \"acc_stderr\": 0.0345423658538061,\n \"acc_norm\"\
: 0.4117647058823529,\n \"acc_norm_stderr\": 0.0345423658538061\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.45147679324894513,\n \"acc_stderr\": 0.032393600173974704,\n \"\
acc_norm\": 0.45147679324894513,\n \"acc_norm_stderr\": 0.032393600173974704\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2556053811659193,\n\
\ \"acc_stderr\": 0.029275891003969927,\n \"acc_norm\": 0.2556053811659193,\n\
\ \"acc_norm_stderr\": 0.029275891003969927\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3816793893129771,\n \"acc_stderr\": 0.0426073515764456,\n\
\ \"acc_norm\": 0.3816793893129771,\n \"acc_norm_stderr\": 0.0426073515764456\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.48760330578512395,\n \"acc_stderr\": 0.045629515481807666,\n \"\
acc_norm\": 0.48760330578512395,\n \"acc_norm_stderr\": 0.045629515481807666\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.294478527607362,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.294478527607362,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.46601941747572817,\n \"acc_stderr\": 0.049392914472734785,\n\
\ \"acc_norm\": 0.46601941747572817,\n \"acc_norm_stderr\": 0.049392914472734785\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.44871794871794873,\n\
\ \"acc_stderr\": 0.0325833464938688,\n \"acc_norm\": 0.44871794871794873,\n\
\ \"acc_norm_stderr\": 0.0325833464938688\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3384418901660281,\n\
\ \"acc_stderr\": 0.01692086958621066,\n \"acc_norm\": 0.3384418901660281,\n\
\ \"acc_norm_stderr\": 0.01692086958621066\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4161849710982659,\n \"acc_stderr\": 0.026538189104705474,\n\
\ \"acc_norm\": 0.4161849710982659,\n \"acc_norm_stderr\": 0.026538189104705474\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23687150837988827,\n\
\ \"acc_stderr\": 0.014219570788103982,\n \"acc_norm\": 0.23687150837988827,\n\
\ \"acc_norm_stderr\": 0.014219570788103982\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.027826109307283686,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.027826109307283686\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3762057877813505,\n\
\ \"acc_stderr\": 0.02751392568354943,\n \"acc_norm\": 0.3762057877813505,\n\
\ \"acc_norm_stderr\": 0.02751392568354943\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.345679012345679,\n \"acc_stderr\": 0.02646248777700188,\n\
\ \"acc_norm\": 0.345679012345679,\n \"acc_norm_stderr\": 0.02646248777700188\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880582,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880582\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.29335071707953064,\n\
\ \"acc_stderr\": 0.011628520449582071,\n \"acc_norm\": 0.29335071707953064,\n\
\ \"acc_norm_stderr\": 0.011628520449582071\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.25735294117647056,\n \"acc_stderr\": 0.0265565194700415,\n\
\ \"acc_norm\": 0.25735294117647056,\n \"acc_norm_stderr\": 0.0265565194700415\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.33169934640522875,\n \"acc_stderr\": 0.01904748523936038,\n \
\ \"acc_norm\": 0.33169934640522875,\n \"acc_norm_stderr\": 0.01904748523936038\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.39090909090909093,\n\
\ \"acc_stderr\": 0.04673752333670237,\n \"acc_norm\": 0.39090909090909093,\n\
\ \"acc_norm_stderr\": 0.04673752333670237\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.025607375986579153,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.025607375986579153\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4925373134328358,\n\
\ \"acc_stderr\": 0.035351400842767194,\n \"acc_norm\": 0.4925373134328358,\n\
\ \"acc_norm_stderr\": 0.035351400842767194\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.37349397590361444,\n\
\ \"acc_stderr\": 0.037658451171688624,\n \"acc_norm\": 0.37349397590361444,\n\
\ \"acc_norm_stderr\": 0.037658451171688624\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30409356725146197,\n \"acc_stderr\": 0.03528211258245232,\n\
\ \"acc_norm\": 0.30409356725146197,\n \"acc_norm_stderr\": 0.03528211258245232\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476206,\n \"mc2\": 0.4295422712870499,\n\
\ \"mc2_stderr\": 0.015116432520822828\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5461720599842147,\n \"acc_stderr\": 0.01399244156370708\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07657316148597422,\n \
\ \"acc_stderr\": 0.007324564881451572\n }\n}\n```"
repo_url: https://huggingface.co/uukuguy/Mistral-7B-OpenOrca-lora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|arc:challenge|25_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|gsm8k|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hellaswag|10_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T12-58-21.089741.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-04T12-58-21.089741.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- '**/details_harness|winogrande|5_2024-03-04T12-58-21.089741.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-04T12-58-21.089741.parquet'
- config_name: results
data_files:
- split: 2024_03_04T12_58_21.089741
path:
- results_2024-03-04T12-58-21.089741.parquet
- split: latest
path:
- results_2024-03-04T12-58-21.089741.parquet
---
# Dataset Card for Evaluation run of Qwen/Qwen1.5-0.5B-Chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Qwen/Qwen1.5-0.5B-Chat](https://huggingface.co/Qwen/Qwen1.5-0.5B-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Qwen__Qwen1.5-0.5B-Chat",
"harness_winogrande_5",
split="train")
```
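Since splits are named after the run timestamp (with `_` in place of `-` and `:`), you can parse a split name back into a `datetime` to order runs programmatically. This is an illustrative sketch (the `split_to_datetime` helper is not part of the dataset tooling):

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    # Split names encode the run timestamp with underscores standing in
    # for the dashes/colons, e.g. "2024_03_04T12_58_21.089741".
    date_part, time_part = split_name.split("T")
    iso = f"{date_part.replace('_', '-')}T{time_part.replace('_', ':')}"
    return datetime.fromisoformat(iso)

print(split_to_datetime("2024_03_04T12_58_21.089741"))
# → 2024-03-04 12:58:21.089741
```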
## Latest results
These are the [latest results from run 2024-03-04T12:58:21.089741](https://huggingface.co/datasets/open-llm-leaderboard/details_Qwen__Qwen1.5-0.5B-Chat/blob/main/results_2024-03-04T12-58-21.089741.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3365001222738154,
"acc_stderr": 0.03347053932587535,
"acc_norm": 0.33938163452253794,
"acc_norm_stderr": 0.03425564830442555,
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476206,
"mc2": 0.4295422712870499,
"mc2_stderr": 0.015116432520822828
},
"harness|arc:challenge|25": {
"acc": 0.26706484641638223,
"acc_stderr": 0.012928933196496363,
"acc_norm": 0.3054607508532423,
"acc_norm_stderr": 0.013460080478002498
},
"harness|hellaswag|10": {
"acc": 0.3593905596494722,
"acc_stderr": 0.004788412062375709,
"acc_norm": 0.44074885480979886,
"acc_norm_stderr": 0.00495462230873898
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066654,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066654
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3471698113207547,
"acc_stderr": 0.029300101705549655,
"acc_norm": 0.3471698113207547,
"acc_norm_stderr": 0.029300101705549655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.31213872832369943,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.31213872832369943,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179327,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179327
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.02802022627120022,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.02802022627120022
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518754,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518754
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400175,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400175
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790606,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.29354838709677417,
"acc_stderr": 0.0259060870213193,
"acc_norm": 0.29354838709677417,
"acc_norm_stderr": 0.0259060870213193
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.031785297106427496,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.031785297106427496
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.47878787878787876,
"acc_stderr": 0.03900828913737302,
"acc_norm": 0.47878787878787876,
"acc_norm_stderr": 0.03900828913737302
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.033586181457325226,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.033586181457325226
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3160621761658031,
"acc_stderr": 0.03355397369686174,
"acc_norm": 0.3160621761658031,
"acc_norm_stderr": 0.03355397369686174
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2512820512820513,
"acc_stderr": 0.021992016662370547,
"acc_norm": 0.2512820512820513,
"acc_norm_stderr": 0.021992016662370547
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31932773109243695,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.31932773109243695,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3706422018348624,
"acc_stderr": 0.020707458164352988,
"acc_norm": 0.3706422018348624,
"acc_norm_stderr": 0.020707458164352988
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.030058202704309846,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.030058202704309846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.0345423658538061,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.0345423658538061
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.45147679324894513,
"acc_stderr": 0.032393600173974704,
"acc_norm": 0.45147679324894513,
"acc_norm_stderr": 0.032393600173974704
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2556053811659193,
"acc_stderr": 0.029275891003969927,
"acc_norm": 0.2556053811659193,
"acc_norm_stderr": 0.029275891003969927
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3816793893129771,
"acc_stderr": 0.0426073515764456,
"acc_norm": 0.3816793893129771,
"acc_norm_stderr": 0.0426073515764456
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.48760330578512395,
"acc_stderr": 0.045629515481807666,
"acc_norm": 0.48760330578512395,
"acc_norm_stderr": 0.045629515481807666
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.294478527607362,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.294478527607362,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.46601941747572817,
"acc_stderr": 0.049392914472734785,
"acc_norm": 0.46601941747572817,
"acc_norm_stderr": 0.049392914472734785
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.44871794871794873,
"acc_stderr": 0.0325833464938688,
"acc_norm": 0.44871794871794873,
"acc_norm_stderr": 0.0325833464938688
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3384418901660281,
"acc_stderr": 0.01692086958621066,
"acc_norm": 0.3384418901660281,
"acc_norm_stderr": 0.01692086958621066
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.026538189104705474,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.026538189104705474
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23687150837988827,
"acc_stderr": 0.014219570788103982,
"acc_norm": 0.23687150837988827,
"acc_norm_stderr": 0.014219570788103982
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.027826109307283686,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.027826109307283686
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3762057877813505,
"acc_stderr": 0.02751392568354943,
"acc_norm": 0.3762057877813505,
"acc_norm_stderr": 0.02751392568354943
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.345679012345679,
"acc_stderr": 0.02646248777700188,
"acc_norm": 0.345679012345679,
"acc_norm_stderr": 0.02646248777700188
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880582,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880582
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.29335071707953064,
"acc_stderr": 0.011628520449582071,
"acc_norm": 0.29335071707953064,
"acc_norm_stderr": 0.011628520449582071
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.25735294117647056,
"acc_stderr": 0.0265565194700415,
"acc_norm": 0.25735294117647056,
"acc_norm_stderr": 0.0265565194700415
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.33169934640522875,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.33169934640522875,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.39090909090909093,
"acc_stderr": 0.04673752333670237,
"acc_norm": 0.39090909090909093,
"acc_norm_stderr": 0.04673752333670237
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2,
"acc_stderr": 0.025607375986579153,
"acc_norm": 0.2,
"acc_norm_stderr": 0.025607375986579153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4925373134328358,
"acc_stderr": 0.035351400842767194,
"acc_norm": 0.4925373134328358,
"acc_norm_stderr": 0.035351400842767194
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-virology|5": {
"acc": 0.37349397590361444,
"acc_stderr": 0.037658451171688624,
"acc_norm": 0.37349397590361444,
"acc_norm_stderr": 0.037658451171688624
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30409356725146197,
"acc_stderr": 0.03528211258245232,
"acc_norm": 0.30409356725146197,
"acc_norm_stderr": 0.03528211258245232
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476206,
"mc2": 0.4295422712870499,
"mc2_stderr": 0.015116432520822828
},
"harness|winogrande|5": {
"acc": 0.5461720599842147,
"acc_stderr": 0.01399244156370708
},
"harness|gsm8k|5": {
"acc": 0.07657316148597422,
"acc_stderr": 0.007324564881451572
}
}
```
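The top-level `"all"` entry aggregates the per-task metrics; for the `hendrycksTest` (MMLU) tasks this is a plain mean over task accuracies. A minimal sketch of that aggregation, using two of the values above (the filtering logic here is illustrative, not the leaderboard's actual code):

```python
# Per-task results as found in the JSON above (truncated to two tasks).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.25925925925925924},
}

# Select the MMLU tasks by their "harness|hendrycksTest-..." key prefix
# and average their accuracies.
mmlu_keys = [k for k in results if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(results[k]["acc"] for k in mmlu_keys) / len(mmlu_keys)
print(round(mmlu_avg, 4))
```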
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
VIRBHADRa/Ywhwh | ---
license: bigscience-openrail-m
---
|
mitsudate/DeepVocal_Install_Mirror | ---
license: mit
---
|
Aditya757864/Audio_transcript | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': Test
'1': Train
splits:
- name: train
num_bytes: 14278558.0
num_examples: 79
download_size: 12571738
dataset_size: 14278558.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
anton-l/wiki_embeddings | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: embedding
sequence: float32
splits:
- name: train
num_bytes: 349480512
num_examples: 58650
download_size: 339264528
dataset_size: 349480512
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
valashir/SMM2-levels-all | ---
dataset_info:
features:
- name: id
dtype: int64
- name: level
sequence:
sequence:
sequence: uint8
- name: text
dtype: string
- name: text-baseline
dtype: string
splits:
- name: train
num_bytes: 30754342973
num_examples: 202096
- name: val
num_bytes: 308874924
num_examples: 2048
download_size: 271196710
dataset_size: 31063217897
---
# Dataset Card for "SMM2-levels-all"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |