| datasetId | card |
|---|---|
gustproof/skeb | ---
license: cc-by-sa-4.0
---
|
open-llm-leaderboard/details_JunchengXie__Mistral-7B-v0.1-gpt-4-20k | ---
pretty_name: Evaluation run of JunchengXie/Mistral-7B-v0.1-gpt-4-20k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JunchengXie/Mistral-7B-v0.1-gpt-4-20k](https://huggingface.co/JunchengXie/Mistral-7B-v0.1-gpt-4-20k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JunchengXie__Mistral-7B-v0.1-gpt-4-20k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-13T18:03:52.010438](https://huggingface.co/datasets/open-llm-leaderboard/details_JunchengXie__Mistral-7B-v0.1-gpt-4-20k/blob/main/results_2024-03-13T18-03-52.010438.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6252500387361519,\n\
\ \"acc_stderr\": 0.03261842726074758,\n \"acc_norm\": 0.6316885825477706,\n\
\ \"acc_norm_stderr\": 0.0332676883678673,\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5469553712101578,\n\
\ \"mc2_stderr\": 0.015264090267771462\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6023890784982935,\n \"acc_stderr\": 0.01430175222327954,\n\
\ \"acc_norm\": 0.6271331058020477,\n \"acc_norm_stderr\": 0.01413117676013117\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6164110734913364,\n\
\ \"acc_stderr\": 0.004852658876775384,\n \"acc_norm\": 0.8172674765982872,\n\
\ \"acc_norm_stderr\": 0.0038565729468310055\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851112,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851112\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.02432173848460235,\n \
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.02432173848460235\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473072,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473072\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.03128217706368461,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.03128217706368461\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8201834862385321,\n\
\ \"acc_stderr\": 0.016465345467391534,\n \"acc_norm\": 0.8201834862385321,\n\
\ \"acc_norm_stderr\": 0.016465345467391534\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.0340763209385405,\n\
\ \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.0340763209385405\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159263,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159263\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"\
acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077816,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077816\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001506,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001506\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624734,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624734\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2927374301675978,\n\
\ \"acc_stderr\": 0.015218109544410174,\n \"acc_norm\": 0.2927374301675978,\n\
\ \"acc_norm_stderr\": 0.015218109544410174\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596729,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596729\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43741851368970014,\n\
\ \"acc_stderr\": 0.012669813464935726,\n \"acc_norm\": 0.43741851368970014,\n\
\ \"acc_norm_stderr\": 0.012669813464935726\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031218,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031218\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162666,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162666\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5469553712101578,\n\
\ \"mc2_stderr\": 0.015264090267771462\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7292817679558011,\n \"acc_stderr\": 0.012487904760626306\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.36694465504169826,\n \
\ \"acc_stderr\": 0.01327588304771222\n }\n}\n```"
repo_url: https://huggingface.co/JunchengXie/Mistral-7B-v0.1-gpt-4-20k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|arc:challenge|25_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|gsm8k|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hellaswag|10_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T18-03-52.010438.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T18-03-52.010438.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- '**/details_harness|winogrande|5_2024-03-13T18-03-52.010438.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-13T18-03-52.010438.parquet'
- config_name: results
data_files:
- split: 2024_03_13T18_03_52.010438
path:
- results_2024-03-13T18-03-52.010438.parquet
- split: latest
path:
- results_2024-03-13T18-03-52.010438.parquet
---
# Dataset Card for Evaluation run of JunchengXie/Mistral-7B-v0.1-gpt-4-20k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [JunchengXie/Mistral-7B-v0.1-gpt-4-20k](https://huggingface.co/JunchengXie/Mistral-7B-v0.1-gpt-4-20k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JunchengXie__Mistral-7B-v0.1-gpt-4-20k",
"harness_winogrande_5",
split="train")
```
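The timestamped split names above encode the run time with `-` and `:` replaced by `_` (e.g. `2024_03_13T18_03_52.010438` for run 2024-03-13T18:03:52.010438). As a rough sketch (an assumption inferred from the split names listed in this card, not an official helper), they can be mapped back to `datetime` objects like this:

```python
from datetime import datetime

def split_to_timestamp(split_name: str) -> datetime:
    # Split names use '_' where the ISO timestamp has '-' and ':',
    # e.g. "2024_03_13T18_03_52.010438" <-> "2024-03-13T18:03:52.010438".
    date_part, time_part = split_name.split("T")
    y, m, d = date_part.split("_")
    hh, mm, ss = time_part.split("_")
    return datetime.fromisoformat(f"{y}-{m}-{d}T{hh}:{mm}:{ss}")

ts = split_to_timestamp("2024_03_13T18_03_52.010438")
```

This can be handy for sorting the per-run splits chronologically instead of relying on the `latest` alias.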
## Latest results
These are the [latest results from run 2024-03-13T18:03:52.010438](https://huggingface.co/datasets/open-llm-leaderboard/details_JunchengXie__Mistral-7B-v0.1-gpt-4-20k/blob/main/results_2024-03-13T18-03-52.010438.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6252500387361519,
"acc_stderr": 0.03261842726074758,
"acc_norm": 0.6316885825477706,
"acc_norm_stderr": 0.0332676883678673,
"mc1": 0.3733170134638923,
"mc1_stderr": 0.016932370557570634,
"mc2": 0.5469553712101578,
"mc2_stderr": 0.015264090267771462
},
"harness|arc:challenge|25": {
"acc": 0.6023890784982935,
"acc_stderr": 0.01430175222327954,
"acc_norm": 0.6271331058020477,
"acc_norm_stderr": 0.01413117676013117
},
"harness|hellaswag|10": {
"acc": 0.6164110734913364,
"acc_stderr": 0.004852658876775384,
"acc_norm": 0.8172674765982872,
"acc_norm_stderr": 0.0038565729468310055
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851112,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851112
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.02432173848460235,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.02432173848460235
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473072,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.03128217706368461,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.03128217706368461
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.016465345467391534,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.016465345467391534
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.0340763209385405,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.0340763209385405
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077816,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077816
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001506,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001506
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2927374301675978,
"acc_stderr": 0.015218109544410174,
"acc_norm": 0.2927374301675978,
"acc_norm_stderr": 0.015218109544410174
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.02465968518596729,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.02465968518596729
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43741851368970014,
"acc_stderr": 0.012669813464935726,
"acc_norm": 0.43741851368970014,
"acc_norm_stderr": 0.012669813464935726
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031218,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031218
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162666,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162666
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3733170134638923,
"mc1_stderr": 0.016932370557570634,
"mc2": 0.5469553712101578,
"mc2_stderr": 0.015264090267771462
},
"harness|winogrande|5": {
"acc": 0.7292817679558011,
"acc_stderr": 0.012487904760626306
},
"harness|gsm8k|5": {
"acc": 0.36694465504169826,
"acc_stderr": 0.01327588304771222
}
}
```
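A results dict shaped like the JSON above can be queried directly once loaded, for example to pull out per-task `acc_norm` scores. A minimal sketch (only a few of the tasks from the JSON are embedded here for brevity; the values are copied from the results above):

```python
# Subset of the results dict shown above.
results = {
    "all": {"acc_norm": 0.6316885825477706},
    "harness|arc:challenge|25": {"acc_norm": 0.6271331058020477},
    "harness|hellaswag|10": {"acc_norm": 0.8172674765982872},
}

# Collect per-task acc_norm, skipping the "all" aggregate.
per_task = {
    name: metrics["acc_norm"]
    for name, metrics in results.items()
    if name != "all" and "acc_norm" in metrics
}

# Task with the highest normalized accuracy in this subset.
best = max(per_task, key=per_task.get)
```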
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
carlavic/Chicoin | ---
license: openrail
---
|
Gatozu35/DNSMOS-TTS | ---
pretty_name: DNSMOS Score for common TTS datasets
---
(Placeholder)
# DNSMOS-TTS
DNSMOS-TTS contains DNSMOS scores for common TTS datasets.
This repo uses Lhotse to manage datasets.
For example, to load LJ-Speech:
```py
from lhotse import CutSet
for cut in CutSet.from_webdataset("pipe:curl -s -L https://huggingface.co/datasets/Gatozu35/DNSMOS-TTS/resolve/main/ljspeech_mos.tar"):
wav = cut.load_audio()
mos = cut.supervisions[0].custom["mos"]
...
```
If you don't want to use Lhotse, I have also uploaded a CSV of the scores for each id.
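The CSV route needs no extra dependencies. A minimal sketch with the standard library, using sample rows (the actual filename and column names are assumptions; check the repository's file listing for the real ones):

```python
import csv
import io

# Hypothetical per-utterance scores; the real data would come from the
# CSV file in the repo, e.g. open("<scores>.csv") instead of StringIO.
sample = io.StringIO("id,mos\nLJ001-0001,3.91\nLJ001-0002,4.05\n")

# Map each utterance id to its DNSMOS score.
mos = {row["id"]: float(row["mos"]) for row in csv.DictReader(sample)}
```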
|
thisisgonz/falopita | ---
license: openrail
---
|
Taehun81/testDataset | ---
license: cc-by-4.0
---
|
EgilKarlsen/AA_GPT2_FT | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 80318765
num_examples: 26057
- name: test
num_bytes: 26774056
num_examples: 8686
download_size: 147157938
dataset_size: 107092821
---
# Dataset Card for "AA_GPT2_FT"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
reralle/n-f-n | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': arabic
'1': dutch
'2': french
'3': korean
'4': mandarin
'5': portuguese
'6': russian
'7': spanish
'8': uk
'9': usa
splits:
- name: train
num_bytes: 689121002.0
num_examples: 784
- name: test
num_bytes: 51756368.0
num_examples: 60
- name: validation
num_bytes: 51955368.0
num_examples: 60
download_size: 752141512
dataset_size: 792832738.0
---
# Dataset Card for "n-f-n"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
seablue/DiDi_GAIA_dataset_jsonl | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: most_similar_instructions
struct:
- name: 下面我给出了一段代码,请你帮我给下面代码加上注释。
dtype: float64
- name: 下面是一段代码,请你添加注释,以便于其他人更好地了解代码。
dtype: float64
- name: 下面是一段可以自动化为你的代码添加注释的工具,请你根据这个工具的描述,使用它为你的代码添加注释。
dtype: float64
- name: 下面是一段需要加注释的代码,请为每一行添加注释并描述其作用。
dtype: float64
- name: 下面的代码令人困惑,请为每行添加注释以解释其含义。
dtype: float64
- name: 下面的代码可能会让其他人感到困惑,请为每一行添加注释以便于其他人理解。
dtype: float64
- name: 下面的代码需要添加注释以解释代码实现的逻辑,请您为其添加注释。
dtype: float64
- name: 下面的代码需要添加注释来解释代码的目的,请你给出相应的注释。
dtype: float64
- name: 下面这段代码需要加入一些注释以便后续使用,请你帮忙补充一下。
dtype: float64
- name: 下面这段代码需要添加注释以解释其中的细节和处理过程,请帮我添加注释。
dtype: float64
- name: 下面这段代码需要添加注释以解释其在整个项目中的作用,请您为其添加注释。
dtype: float64
- name: 下面这段代码需要补充注释来解释变量和函数的用途和功能,请你为其添加注释。
dtype: float64
- name: 你能帮我解释一下以下代码的作用吗?
dtype: float64
- name: 可以为下面的代码添加注释,以便于其他人更好地理解吗?
dtype: float64
- name: 在下面的代码中添加注释,以便阅读代码时更加容易理解和使用。
dtype: float64
- name: 在下面这段代码中添加注释,使得代码更加易读、易用。
dtype: float64
- name: 帮我增加一些注释,让下面这个代码片段更好理解。
dtype: float64
- name: 我需要你帮我写一个自动生成注释的程序。请写一段代码,使其能够根据每行代码的功能生成注释。
dtype: float64
- name: 根据下面的代码实现,请为其添加注释以便更好地了解其实现思路。
dtype: float64
- name: 根据下面的代码逻辑,请为其添加注释,以方便更好地理解代码。
dtype: float64
- name: 根据下面的代码,为每一行添加注释以解释其含义。
dtype: float64
- name: 根据下面的代码,请为每行添加注释来描述其作用。
dtype: float64
- name: 给下面这段代码添加注释,让其他人了解代码的实现细节和使用方法。
dtype: float64
- name: 能不能编写一段代码来自动生成注释呢?
dtype: float64
- name: 能否为下面的代码添加注释,以便于我和其他人更好地理解和使用?
dtype: float64
- name: 能否为下面的代码添加注释,使得其他人更容易了解代码和使用方法?
dtype: float64
- name: 能否为下面的代码添加注释,使得我们更容易理解代码的逻辑和实现方法?
dtype: float64
- name: 能否为下面的代码添加注释,描述代码的主要作用和输入输出。
dtype: float64
- name: 能否为下面的函数添加注释,以方便阅读和理解代码?
dtype: float64
- name: 能否为下面的类添加注释,以便更好地理解其属性和方法?
dtype: float64
- name: 能否为下面这段代码添加注释,让其他人更好地了解代码的功能和使用方法?
dtype: float64
- name: 能否为这个开源项目中的代码添加注释,以便新的开发者更快地了解其功能?
dtype: float64
- name: 能否为这段代码添加注释,解释各个变量和函数的作用?
dtype: float64
- name: 能否为这段代码编写注释,以便于初学者理解代码逻辑和实现方式?
dtype: float64
- name: 能否为这段代码自动生成注释,描述代码的主要功能和用途?
dtype: float64
- name: 能否帮我添加下面这段代码的注释,使得代码更加容易阅读和理解。
dtype: float64
- name: 能否编写代码,自动识别出变量和函数的作用,然后为它们添加注释?
dtype: float64
- name: 能否请你为下面这段代码增加一些注释,使得别人也能够看懂?
dtype: float64
- name: 能否请你在下面的代码中为每一行添加一些注释?
dtype: float64
- name: 能帮我给下面代码加上注释吗?
dtype: float64
- name: 请为下面的代码增加注释,以便在以后代码需要维护时更好地理解和更改代码。
dtype: float64
- name: 请为下面的代码添加注释,以便于后续的调试和维护。
dtype: float64
- name: 请为下面的代码添加注释,以便于我们更好地理解代码的实现和功能。
dtype: float64
- name: 请为下面的代码添加注释,以解释每一行代码的用途。
dtype: float64
- name: 请为下面的代码编写简要的注释,方便阅读和理解。
dtype: float64
- name: 请为下面的函数添加注释,描述函数的输入、输出和用途。
dtype: float64
- name: 请为下面这段代码添加注释来解释各个变量和函数的作用。
dtype: float64
- name: 请为下面这段代码添加注释,注释中需要说明代码的执行步骤以及相关函数的作用。
dtype: float64
- name: 请为下面这段代码添加注释,注释中需要说明每个函数的功能和用途。
dtype: float64
- name: 请为下面这段代码添加注释,注释中需要说明该代码的入参和出参以及相关算法的实现。
dtype: float64
- name: 请为以下代码添加注释,以便更好地理解代码的实现逻辑。
dtype: float64
- name: 请为以下代码添加注释,描述代码的主要实现思路。
dtype: float64
- name: 请你为下面的代码添加注释,解释代码的主要思路和逻辑。
dtype: float64
- name: 请你写一段代码,并为你所写的代码加上适当的注释。
dtype: float64
- name: 请你在下面的代码中找出难懂的地方并加上相应的注释。
dtype: float64
- name: 请你帮忙把下面的代码加上适当的注释,以便于其他人更好地了解代码。
dtype: float64
- name: 请你根据下面的代码,给出一份详细的代码注释,让其他人更好地理解代码。
dtype: float64
- name: 请你给下面的代码添加注释,以便于我们更好地理解代码的功能和实现方法。
dtype: float64
- name: 请写一个生成类方法注释的函数。
dtype: float64
- name: 请写一个生成自然语言描述的代码注释的代码。
dtype: float64
- name: 请写一段代码,使其能够为一个包含多个函数的脚本文件自动添加注释。
dtype: float64
- name: 请写一段代码,使其能够为一个带有参数的函数生成注释。
dtype: float64
- name: 请写一段代码,使其能够根据代码段的逻辑结构生成注释。
dtype: float64
- name: 请写一段代码,使其能够根据变量和函数的命名规则来自动生成注释。
dtype: float64
- name: 请写一段代码,使其能够自动为一个特定函数生成相应的注释。
dtype: float64
- name: 请写一段代码,根据函数的输入和输出来自动生成注释。
dtype: float64
- name: 请写一段代码,生成注释,描述一个函数的作用和输入输出格式。
dtype: float64
- name: 请写一段代码,生成注释,描述一个函数的输入输出和用法实例。
dtype: float64
- name: 请写一段代码,生成注释,描述一个程序中的类的功能和属性。
dtype: float64
- name: 请写一段代码,自动为一个Python模块里的所有函数和类生成注释。
dtype: float64
- name: 请对下面的代码添加适当的注释,以便其他人更好地理解。
dtype: float64
- name: 请帮我为下面的代码片段添加注释,方便其他人更好地理解代码。
dtype: float64
- name: 请帮我为这个循环添加注释,以便更好地理解它。
dtype: float64
- name: 请帮我给下面这段代码添加注释,以便于其他人了解代码的使用方法和功能。
dtype: float64
- name: 请您给下面的代码加上注释,以便于后续代码维护和开发。
dtype: float64
- name: 请根据下面的代码描述每一行的功能,然后加上注释。
dtype: float64
- name: 请根据下面的代码,为代码添加注释以便于其他人理解。
dtype: float64
- name: 请根据以下代码的功能,为其加上注释。
dtype: float64
- name: 请根据你的理解为下面的代码片段增加注释。
dtype: float64
- name: 请生成一组注释,描述下面这段代码的主要功能。
dtype: float64
- name: 请给下面的代码加上注释。
dtype: float64
- name: 请给下面的代码添加注释来解释代码的执行顺序。
dtype: float64
- name: 请给下面的代码添加注释,以便于其他人更好地了解代码的使用方法和注意事项。
dtype: float64
- name: 请给下面这段代码添加注释,以便于我们更好地了解代码的功能和用法。
dtype: float64
- name: 请问你能为下面的代码添加注释吗?
dtype: float64
- name: 请阅读下面的代码并为其添加注释以指明该段代码的功能。
dtype: float64
- name: 请阅读下面这段代码,并为它添加必要的注释,以便理解。
dtype: float64
- name: 请阅读以下代码,给代码加上相应的注释。
dtype: float64
- name: 这是一段优化后的代码,请你为每一行代码添加注释。
dtype: float64
- name: 这段代码可能有一些复杂,你能不能帮我添加一些注释来让它更加易读?
dtype: float64
- name: 这段代码有些复杂,请你给每个函数和循环写上注释,以便于读者理解代码逻辑。
dtype: float64
- name: 这里是一段写好的代码,你能帮我为它添加注释吗?
dtype: float64
- name: 这里是一段没有注释的代码,你能帮我添加一些注释,使其更易读吗?
dtype: float64
- name: 麻烦你为这些变量和函数写一些注释,以便于别人理解代码。
dtype: float64
- name: avg_similarity_score
dtype: float64
splits:
- name: train
num_bytes: 62883
num_examples: 73
download_size: 128943
dataset_size: 62883
---
# Dataset Card for "DiDi_GAIA_dataset_jsonl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_148 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 18333354096.375
num_examples: 190877
download_size: 16591054234
dataset_size: 18333354096.375
---
# Dataset Card for "chunk_148"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Codec-SUPERB/gtzan_synth | ---
configs:
- config_name: default
data_files:
- split: original
path: data/original-*
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k_12bps
path: data/encodec_24k_12bps-*
- split: encodec_24k_1_5bps
path: data/encodec_24k_1_5bps-*
- split: encodec_24k_24bps
path: data/encodec_24k_24bps-*
- split: encodec_24k_3bps
path: data/encodec_24k_3bps-*
- split: encodec_24k_6bps
path: data/encodec_24k_6bps-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: id
dtype: string
splits:
- name: original
num_bytes: 2880096808.0
num_examples: 1000
- name: academicodec_hifi_16k_320d
num_bytes: 960097680.0
num_examples: 1000
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 960097680.0
num_examples: 1000
- name: academicodec_hifi_24k_320d
num_bytes: 1440097680.0
num_examples: 1000
- name: audiodec_24k_320d
num_bytes: 1440097806.0
num_examples: 1000
- name: dac_16k
num_bytes: 960097680.0
num_examples: 1000
- name: dac_24k
num_bytes: 1440097680.0
num_examples: 1000
- name: dac_44k
num_bytes: 2646097680.0
num_examples: 1000
- name: encodec_24k_12bps
num_bytes: 1440097680.0
num_examples: 1000
- name: encodec_24k_1_5bps
num_bytes: 1440097680.0
num_examples: 1000
- name: encodec_24k_24bps
num_bytes: 1440097680.0
num_examples: 1000
- name: encodec_24k_3bps
num_bytes: 1440097680.0
num_examples: 1000
- name: encodec_24k_6bps
num_bytes: 1440097680.0
num_examples: 1000
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 960097680.0
num_examples: 1000
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 960097680.0
num_examples: 1000
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 960097680.0
num_examples: 1000
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 960097680.0
num_examples: 1000
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 960097680.0
num_examples: 1000
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 960097680.0
num_examples: 1000
- name: speech_tokenizer_16k
num_bytes: 960097680.0
num_examples: 1000
download_size: 26632202249
dataset_size: 26647952854.0
---
# Dataset Card for "gtzan_synth"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ZakeeQureshi/data | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: Text
dtype: string
splits:
- name: train
num_bytes: 194
num_examples: 3
download_size: 1202
dataset_size: 194
---
# Dataset Card for "data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-xsum-ad8ac8a3-10195349 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: t5-base
metrics: []
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: t5-base
* Dataset: xsum
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@abhijeet](https://huggingface.co/abhijeet) for evaluating this model. |
speed1/pes | ---
license: openrail
---
|
BramVanroy/chatgpt-dutch-simplification | ---
license: cc-by-nc-sa-4.0
task_categories:
- text2text-generation
task_ids:
- text-simplification
language:
- nl
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
splits:
- name: train
num_examples: 1013
- name: validation
num_examples: 126
- name: test
num_examples: 128
dataset_size: 1267
train-eval-index:
- config: default
task: text2text-generation
task_id: text-simplification
splits:
train_split: train
eval_split: validation
test_split: test
metrics:
- type: sari
name: Test SARI
- type: rouge
name: Test ROUGE
pretty_name: ChatGPT Dutch Simplification
---
# Dataset Card for ChatGPT Dutch Simplification
## Dataset Description
- **Point of Contact:** [Bram Vanroy](https://twitter.com/BramVanroy)
### Dataset Summary
Created in the context of a master's thesis by Charlotte Van de Velde as part of the Master of Science in Artificial Intelligence at KU Leuven.
Charlotte is supervised by Vincent Vandeghinste and Bram Vanroy.
The dataset contains Dutch source sentences and aligned simplified sentences, generated with ChatGPT. All splits combined, the dataset
consists of 1267 entries.
Charlotte used gpt-3.5-turbo with the following prompt:
> Schrijf een moeilijke zin, en daarna een simpele versie ervan. De simpele versie moet makkelijker zijn om te lezen en te begrijpen. Schrijf "Moeilijke zin: " aan het begin van de moeilijke zin, en "Simpele versie: " aan het begin van de simpele versie.

(In English: *Write a difficult sentence, and then a simple version of it. The simple version should be easier to read and understand. Write "Moeilijke zin: " at the start of the difficult sentence, and "Simpele versie: " at the start of the simple version.*)
Parameters:
- `temperature=0.9`
- `max_tokens=1000`
- `top_p=1`
- `frequency_penalty=0.1`
- `presence_penalty=0`
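The prompt above instructs the model to label each pair with "Moeilijke zin: " and "Simpele versie: ". The card does not document how the generations were post-processed, but a minimal sketch of splitting that labelled format into `source`/`target` pairs (the function name and regex are assumptions, not the authors' code) could look like:

```python
import re

# Assumed format, per the prompt: "Moeilijke zin: <difficult>\nSimpele versie: <simple>"
PAIR_RE = re.compile(
    r"Moeilijke zin:\s*(?P<source>.+?)\s*Simpele versie:\s*(?P<target>.+)",
    flags=re.DOTALL,
)

def parse_pair(generation: str):
    """Split one labelled generation into (source, target), or None if unlabelled."""
    m = PAIR_RE.search(generation)
    if m is None:
        return None
    return m.group("source").strip(), m.group("target").strip()

pair = parse_pair(
    "Moeilijke zin: Het fenomeen vindt plaats.\n"
    "Simpele versie: Het gebeurt."
)
print(pair)  # ('Het fenomeen vindt plaats.', 'Het gebeurt.')
```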
Bram Vanroy was not involved in the data collection but only generated the data splits and provides the dataset as-is on this online platform. Splits
were generated with [the following script](https://github.com/BramVanroy/mai-simplification-nl-2023#1-split-the-data).
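The actual split logic lives in the script linked above; purely as an illustration, a seeded shuffle-and-slice that reproduces the split sizes (the seed and slicing order here are assumptions, not what the script does) would be:

```python
import random

def split_data(items, seed=42, val_n=126, test_n=128):
    """Illustrative seeded shuffle-and-slice; not the authors' actual script."""
    items = list(items)
    random.Random(seed).shuffle(items)  # deterministic shuffle for reproducibility
    test = items[:test_n]
    val = items[test_n:test_n + val_n]
    train = items[test_n + val_n:]
    return train, val, test

train, val, test = split_data(range(1267))
print(len(train), len(val), len(test))  # 1013 126 128
```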
### Supported Tasks and Leaderboards
Intended for text2text generation, specifically text simplification.
### Languages
Dutch
## Dataset Structure
### Data Instances
```python
{
"source": "Het fenomeen van acquisitie van taalkennis vindt plaats door middel van het opdoen van ervaringen met de taal in diverse contexten.",
"target": "Je leert een taal door de taal te gebruiken in verschillende situaties."
}
```
### Data Fields
- source: the "more difficult" Dutch sentence
- target: the simplified Dutch sentence
### Data Splits
- train: 1013
- validation: 126
- test: 128
## Disclaimer about data usage
This text was generated (either in part or in full) with GPT-3.5 (`gpt-3.5-turbo`), OpenAI’s large-scale language-generation model. Upon generating draft language, the author reviewed, edited, and revised the language to their own liking and takes ultimate responsibility for the content of this publication.
If you use this dataset, you must also follow the [Sharing](https://openai.com/policies/sharing-publication-policy) and [Usage](https://openai.com/policies/usage-policies) policies.
As clearly stated in their [Terms of Use](https://openai.com/policies/terms-of-use), specifically 2c.iii, "[you may not] use output from the Services to develop models that compete with OpenAI". That means that you cannot use this dataset to build models that are intended to commercially compete with OpenAI. [As far as I am aware](https://law.stackexchange.com/questions/93308/licensing-material-generated-with-chatgpt), that is a specific restriction that should serve as an addendum to the current license.
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/a4544219 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 186
num_examples: 10
download_size: 1336
dataset_size: 186
---
# Dataset Card for "a4544219"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rishiraj/hinglish | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: category
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 31866063
num_examples: 9500
- name: test
num_bytes: 1712744
num_examples: 500
download_size: 19747792
dataset_size: 33578807
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
Translated from English to Hindi using the Google Translation API, then transliterated from Hindi to Hinglish using [libindic/indic-trans](https://github.com/libindic/indic-trans). |
autoevaluate/autoeval-staging-eval-project-0d3aacb2-653b-459b-af2f-2d90d5362791-75 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: binary_classification
model: autoevaluate/binary-classification
metrics: ['matthews_correlation']
dataset_name: glue
dataset_config: sst2
dataset_split: validation
col_mapping:
text: sentence
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Binary Text Classification
* Model: autoevaluate/binary-classification
* Dataset: glue
* Config: sst2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
msamogh/gpt-negochat | ---
license: apache-2.0
---
# Dataset Card for GPT-Negochat
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
## Dataset Description
- **Repository:** https://github.com/msamogh/GPT-NegoChat-Corpus
- **Point of Contact:** msamogh@gmail.com
### Dataset Summary
The **GPT-Negochat** corpus is a modified version of the original Negochat corpus (https://aclanthology.org/L16-1501/), which contains negotiation dialogues between an Employer and a Candidate. The utterances in the original corpus were generated with a template-based NLG module and therefore sound robotic; in general, they do not read as convincingly real.
GPT-Negochat is the result of using GPT-3 to modify this original corpus to make the dialogues resemble actual job-negotiation dialogues more closely while still retaining the original meaning of the utterances.
In addition to rephrasing the utterances, a small set of highly unrealistic dialogue segments has been removed in GPT-Negochat without affecting the coherence of the surrounding dialogue.
### Supported Tasks and Leaderboards
- Dialogue Act Classification
- Offer Identification
- Agreement Tracking
### Languages
- English
## Dataset Structure
### Data Fields
Below is an excerpt containing two consecutive turns from a dialogue. The `input` field contains the utterance from the original Negochat corpus. The `augmented_input` field contains the same utterance rephrased using GPT-3.
```json
{
"role": "Candidate",
"input": "I want a position of project manager",
"output": [
{
"Offer": {
"Job Description": "Project Manager"
}
}
],
"augmented_input": "I'm interested in a project manager role."
},
{
"role": "Employer",
"input": "I do have programmer positions open with a strong potential to advance to project manager based on your performance.",
"output": [
{
"Offer": {
"Job Description": "Programmer"
}
}
],
"augmented_input": "We do have programmer roles available that could provide you with the opportunity to advance to project manager based on your performance. "
}
```
## Dataset Creation
### Curation Rationale
The original Negochat corpus is one of the few dialogue corpora containing turn-level annotations for offers, acceptances, and rejections in negotiation dialogues.
However, the utterances in the corpus were generated using a template-based NLG system, which makes the dialogues unrealistic to the point of sounding robotic at times.
We wanted to make the utterances sound more like those from an actual negotiation dialogue in a job interview.
### Source Data
#### Initial Data Collection and Normalization
The original Negochat corpus can be found here: [https://github.com/vaskonov/negochat_corpus](https://github.com/vaskonov/negochat_corpus)
## Annotations
Since each utterance in GPT-Negochat was generated by rephrasing the original without changing the underlying meaning, we simply transfer over the annotations from the original Negochat corpus. |
sivan22/shulchan-aruch | ---
dataset_info:
features:
- name: bookname
dtype: string
- name: topic
dtype: string
- name: siman
dtype: string
- name: seif
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 7734020
num_examples: 11440
download_size: 2661186
dataset_size: 7734020
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "shulchan-aruch"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_mncai__yi-34B-v2 | ---
pretty_name: Evaluation run of mncai/yi-34B-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mncai/yi-34B-v2](https://huggingface.co/mncai/yi-34B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mncai__yi-34B-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-10T05:59:23.635398](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__yi-34B-v2/blob/main/results_2023-12-10T05-59-23.635398.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7523453787674309,\n\
\ \"acc_stderr\": 0.02848483810892476,\n \"acc_norm\": 0.756411391315877,\n\
\ \"acc_norm_stderr\": 0.029027731000189076,\n \"mc1\": 0.4173806609547124,\n\
\ \"mc1_stderr\": 0.017262891063272175,\n \"mc2\": 0.5733928094646895,\n\
\ \"mc2_stderr\": 0.01509801265375318\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6373720136518771,\n \"acc_stderr\": 0.014049106564955007,\n\
\ \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.013830568927974332\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6523600876319459,\n\
\ \"acc_stderr\": 0.004752476997887817,\n \"acc_norm\": 0.8500298745269866,\n\
\ \"acc_norm_stderr\": 0.0035631244274585126\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6962962962962963,\n\
\ \"acc_stderr\": 0.03972552884785137,\n \"acc_norm\": 0.6962962962962963,\n\
\ \"acc_norm_stderr\": 0.03972552884785137\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474935,\n\
\ \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474935\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n\
\ \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8150943396226416,\n \"acc_stderr\": 0.02389335183446432,\n\
\ \"acc_norm\": 0.8150943396226416,\n \"acc_norm_stderr\": 0.02389335183446432\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n\
\ \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n\
\ \"acc_norm_stderr\": 0.023964965777906935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7225433526011561,\n\
\ \"acc_stderr\": 0.03414014007044036,\n \"acc_norm\": 0.7225433526011561,\n\
\ \"acc_norm_stderr\": 0.03414014007044036\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.049665709039785295,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.049665709039785295\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n\
\ \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.774468085106383,\n \"acc_stderr\": 0.027321078417387533,\n\
\ \"acc_norm\": 0.774468085106383,\n \"acc_norm_stderr\": 0.027321078417387533\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5614035087719298,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.5614035087719298,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7241379310344828,\n \"acc_stderr\": 0.03724563619774631,\n\
\ \"acc_norm\": 0.7241379310344828,\n \"acc_norm_stderr\": 0.03724563619774631\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6878306878306878,\n \"acc_stderr\": 0.023865206836972592,\n \"\
acc_norm\": 0.6878306878306878,\n \"acc_norm_stderr\": 0.023865206836972592\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5476190476190477,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.5476190476190477,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.9032258064516129,\n \"acc_stderr\": 0.016818943416345197,\n \"\
acc_norm\": 0.9032258064516129,\n \"acc_norm_stderr\": 0.016818943416345197\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6748768472906403,\n \"acc_stderr\": 0.032957975663112704,\n \"\
acc_norm\": 0.6748768472906403,\n \"acc_norm_stderr\": 0.032957975663112704\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284343,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284343\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9292929292929293,\n \"acc_stderr\": 0.01826310542019949,\n \"\
acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.01826310542019949\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.01146452335695318,\n\
\ \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695318\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8282051282051283,\n \"acc_stderr\": 0.01912490360342356,\n \
\ \"acc_norm\": 0.8282051282051283,\n \"acc_norm_stderr\": 0.01912490360342356\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3962962962962963,\n \"acc_stderr\": 0.029822619458534,\n \
\ \"acc_norm\": 0.3962962962962963,\n \"acc_norm_stderr\": 0.029822619458534\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.023005459446673964,\n\
\ \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.023005459446673964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248436,\n \"\
acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248436\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.926605504587156,\n \"acc_stderr\": 0.011180976446357573,\n \"\
acc_norm\": 0.926605504587156,\n \"acc_norm_stderr\": 0.011180976446357573\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03214952147802749,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03214952147802749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9117647058823529,\n \"acc_stderr\": 0.01990739979131695,\n \"\
acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.01990739979131695\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865384,\n \
\ \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865384\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8161434977578476,\n\
\ \"acc_stderr\": 0.025998379092356517,\n \"acc_norm\": 0.8161434977578476,\n\
\ \"acc_norm_stderr\": 0.025998379092356517\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\
\ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9008264462809917,\n \"acc_stderr\": 0.02728524631275896,\n \"\
acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.02728524631275896\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8796296296296297,\n\
\ \"acc_stderr\": 0.03145703854306251,\n \"acc_norm\": 0.8796296296296297,\n\
\ \"acc_norm_stderr\": 0.03145703854306251\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.852760736196319,\n \"acc_stderr\": 0.027839915278339653,\n\
\ \"acc_norm\": 0.852760736196319,\n \"acc_norm_stderr\": 0.027839915278339653\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5714285714285714,\n\
\ \"acc_stderr\": 0.04697113923010213,\n \"acc_norm\": 0.5714285714285714,\n\
\ \"acc_norm_stderr\": 0.04697113923010213\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n\
\ \"acc_stderr\": 0.016046261631673137,\n \"acc_norm\": 0.9358974358974359,\n\
\ \"acc_norm_stderr\": 0.016046261631673137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9016602809706258,\n\
\ \"acc_stderr\": 0.010648356301876346,\n \"acc_norm\": 0.9016602809706258,\n\
\ \"acc_norm_stderr\": 0.010648356301876346\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8063583815028902,\n \"acc_stderr\": 0.021274230317515547,\n\
\ \"acc_norm\": 0.8063583815028902,\n \"acc_norm_stderr\": 0.021274230317515547\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7150837988826816,\n\
\ \"acc_stderr\": 0.015096222302469792,\n \"acc_norm\": 0.7150837988826816,\n\
\ \"acc_norm_stderr\": 0.015096222302469792\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8366013071895425,\n \"acc_stderr\": 0.021170623011213512,\n\
\ \"acc_norm\": 0.8366013071895425,\n \"acc_norm_stderr\": 0.021170623011213512\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8135048231511254,\n\
\ \"acc_stderr\": 0.022122439772480768,\n \"acc_norm\": 0.8135048231511254,\n\
\ \"acc_norm_stderr\": 0.022122439772480768\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.019242526226544543,\n\
\ \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.019242526226544543\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.624113475177305,\n \"acc_stderr\": 0.028893955412115875,\n \
\ \"acc_norm\": 0.624113475177305,\n \"acc_norm_stderr\": 0.028893955412115875\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5984354628422425,\n\
\ \"acc_stderr\": 0.01252031512014712,\n \"acc_norm\": 0.5984354628422425,\n\
\ \"acc_norm_stderr\": 0.01252031512014712\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02236867256288675,\n\
\ \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02236867256288675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8169934640522876,\n \"acc_stderr\": 0.015643069911273344,\n \
\ \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.015643069911273344\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8489795918367347,\n \"acc_stderr\": 0.022923004094736833,\n\
\ \"acc_norm\": 0.8489795918367347,\n \"acc_norm_stderr\": 0.022923004094736833\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5903614457831325,\n\
\ \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.5903614457831325,\n\
\ \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276908,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276908\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4173806609547124,\n\
\ \"mc1_stderr\": 0.017262891063272175,\n \"mc2\": 0.5733928094646895,\n\
\ \"mc2_stderr\": 0.01509801265375318\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.010390695970273759\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6497346474601972,\n \
\ \"acc_stderr\": 0.013140409455571286\n }\n}\n```"
repo_url: https://huggingface.co/mncai/yi-34B-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|arc:challenge|25_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|gsm8k|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hellaswag|10_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T05-59-23.635398.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T05-59-23.635398.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- '**/details_harness|winogrande|5_2023-12-10T05-59-23.635398.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-10T05-59-23.635398.parquet'
- config_name: results
data_files:
- split: 2023_12_10T05_59_23.635398
path:
- results_2023-12-10T05-59-23.635398.parquet
- split: latest
path:
- results_2023-12-10T05-59-23.635398.parquet
---
# Dataset Card for Evaluation run of mncai/yi-34B-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mncai/yi-34B-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mncai/yi-34B-v2](https://huggingface.co/mncai/yi-34B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mncai__yi-34B-v2",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-10T05:59:23.635398](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__yi-34B-v2/blob/main/results_2023-12-10T05-59-23.635398.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.7523453787674309,
"acc_stderr": 0.02848483810892476,
"acc_norm": 0.756411391315877,
"acc_norm_stderr": 0.029027731000189076,
"mc1": 0.4173806609547124,
"mc1_stderr": 0.017262891063272175,
"mc2": 0.5733928094646895,
"mc2_stderr": 0.01509801265375318
},
"harness|arc:challenge|25": {
"acc": 0.6373720136518771,
"acc_stderr": 0.014049106564955007,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.013830568927974332
},
"harness|hellaswag|10": {
"acc": 0.6523600876319459,
"acc_stderr": 0.004752476997887817,
"acc_norm": 0.8500298745269866,
"acc_norm_stderr": 0.0035631244274585126
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6962962962962963,
"acc_stderr": 0.03972552884785137,
"acc_norm": 0.6962962962962963,
"acc_norm_stderr": 0.03972552884785137
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474935,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474935
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8150943396226416,
"acc_stderr": 0.02389335183446432,
"acc_norm": 0.8150943396226416,
"acc_norm_stderr": 0.02389335183446432
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9097222222222222,
"acc_stderr": 0.023964965777906935,
"acc_norm": 0.9097222222222222,
"acc_norm_stderr": 0.023964965777906935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.03414014007044036,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.03414014007044036
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.774468085106383,
"acc_stderr": 0.027321078417387533,
"acc_norm": 0.774468085106383,
"acc_norm_stderr": 0.027321078417387533
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5614035087719298,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.5614035087719298,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7241379310344828,
"acc_stderr": 0.03724563619774631,
"acc_norm": 0.7241379310344828,
"acc_norm_stderr": 0.03724563619774631
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6878306878306878,
"acc_stderr": 0.023865206836972592,
"acc_norm": 0.6878306878306878,
"acc_norm_stderr": 0.023865206836972592
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5476190476190477,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.5476190476190477,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9032258064516129,
"acc_stderr": 0.016818943416345197,
"acc_norm": 0.9032258064516129,
"acc_norm_stderr": 0.016818943416345197
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6748768472906403,
"acc_stderr": 0.032957975663112704,
"acc_norm": 0.6748768472906403,
"acc_norm_stderr": 0.032957975663112704
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.028450388805284343,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.028450388805284343
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.01826310542019949,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.01826310542019949
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.01146452335695318,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.01146452335695318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8282051282051283,
"acc_stderr": 0.01912490360342356,
"acc_norm": 0.8282051282051283,
"acc_norm_stderr": 0.01912490360342356
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3962962962962963,
"acc_stderr": 0.029822619458534,
"acc_norm": 0.3962962962962963,
"acc_norm_stderr": 0.029822619458534
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.023005459446673964,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.023005459446673964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.926605504587156,
"acc_stderr": 0.011180976446357573,
"acc_norm": 0.926605504587156,
"acc_norm_stderr": 0.011180976446357573
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03214952147802749,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03214952147802749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.01990739979131695,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.01990739979131695
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.018498315206865384,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.018498315206865384
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8161434977578476,
"acc_stderr": 0.025998379092356517,
"acc_norm": 0.8161434977578476,
"acc_norm_stderr": 0.025998379092356517
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.02728524631275896,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.02728524631275896
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.03145703854306251,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.03145703854306251
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.852760736196319,
"acc_stderr": 0.027839915278339653,
"acc_norm": 0.852760736196319,
"acc_norm_stderr": 0.027839915278339653
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04697113923010213,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04697113923010213
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.016046261631673137,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.016046261631673137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9016602809706258,
"acc_stderr": 0.010648356301876346,
"acc_norm": 0.9016602809706258,
"acc_norm_stderr": 0.010648356301876346
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8063583815028902,
"acc_stderr": 0.021274230317515547,
"acc_norm": 0.8063583815028902,
"acc_norm_stderr": 0.021274230317515547
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7150837988826816,
"acc_stderr": 0.015096222302469792,
"acc_norm": 0.7150837988826816,
"acc_norm_stderr": 0.015096222302469792
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8366013071895425,
"acc_stderr": 0.021170623011213512,
"acc_norm": 0.8366013071895425,
"acc_norm_stderr": 0.021170623011213512
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8135048231511254,
"acc_stderr": 0.022122439772480768,
"acc_norm": 0.8135048231511254,
"acc_norm_stderr": 0.022122439772480768
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.019242526226544543,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.019242526226544543
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.624113475177305,
"acc_stderr": 0.028893955412115875,
"acc_norm": 0.624113475177305,
"acc_norm_stderr": 0.028893955412115875
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5984354628422425,
"acc_stderr": 0.01252031512014712,
"acc_norm": 0.5984354628422425,
"acc_norm_stderr": 0.01252031512014712
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02236867256288675,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02236867256288675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.015643069911273344,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.015643069911273344
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8489795918367347,
"acc_stderr": 0.022923004094736833,
"acc_norm": 0.8489795918367347,
"acc_norm_stderr": 0.022923004094736833
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5903614457831325,
"acc_stderr": 0.038284011150790206,
"acc_norm": 0.5903614457831325,
"acc_norm_stderr": 0.038284011150790206
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276908,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276908
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4173806609547124,
"mc1_stderr": 0.017262891063272175,
"mc2": 0.5733928094646895,
"mc2_stderr": 0.01509801265375318
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.010390695970273759
},
"harness|gsm8k|5": {
"acc": 0.6497346474601972,
"acc_stderr": 0.013140409455571286
}
}
```
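The per-task metrics above can be aggregated locally once the results JSON is downloaded. Below is a minimal sketch of computing a macro-average over the `hendrycksTest` entries; the `results` dict is a hand-copied subset of the JSON above, not the full file (the real file contains one entry per hendrycksTest task):

```python
# Macro-average accuracy over the MMLU (hendrycksTest) tasks.
# `results` is a small hand-copied subset of the results JSON above.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5903614457831325},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8713450292397661},
    "harness|gsm8k|5": {"acc": 0.6497346474601972},
}

# Keep only the hendrycksTest entries; gsm8k and friends are excluded.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU macro-average acc over {len(mmlu_accs)} tasks: {mmlu_macro_avg:.4f}")
```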
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
pietrolesci/yahoo_answers_topics | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- config_name: embedding_all-mpnet-base-v2
data_files:
- split: train
path: embedding_all-mpnet-base-v2/train-*
- split: test
path: embedding_all-mpnet-base-v2/test-*
dataset_info:
- config_name: default
features:
- name: id
dtype: int32
- name: topic
dtype:
class_label:
names:
'0': Society & Culture
'1': Science & Mathematics
'2': Health
'3': Education & Reference
'4': Computers & Internet
'5': Sports
'6': Business & Finance
'7': Entertainment & Music
'8': Family & Relationships
'9': Politics & Government
- name: question_title
dtype: string
- name: question_content
dtype: string
- name: best_answer
dtype: string
- name: text
dtype: string
- name: uid
dtype: int64
splits:
- name: train
num_bytes: 1506571390
num_examples: 1400000
- name: test
num_bytes: 64707724
num_examples: 60000
download_size: 1050038594
dataset_size: 1571279114
- config_name: embedding_all-mpnet-base-v2
features:
- name: uid
dtype: int64
- name: embedding_all-mpnet-base-v2
sequence: float32
splits:
- name: train
num_bytes: 4317600000
num_examples: 1400000
- name: test
num_bytes: 185040000
num_examples: 60000
download_size: 5407717474
dataset_size: 4502640000
---
# Dataset Card for "yahoo_answers_topics"
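In the default config, `topic` is stored as an integer class label. A minimal local sketch of decoding label ids to topic names follows; the mapping is copied from the `class_label` names in the YAML above, and no download is required:

```python
# Topic id -> name mapping, copied from the class_label names
# declared in the dataset_info YAML above.
YAHOO_TOPICS = [
    "Society & Culture",
    "Science & Mathematics",
    "Health",
    "Education & Reference",
    "Computers & Internet",
    "Sports",
    "Business & Finance",
    "Entertainment & Music",
    "Family & Relationships",
    "Politics & Government",
]

def topic_name(label_id: int) -> str:
    """Decode an integer `topic` label to its human-readable name."""
    return YAHOO_TOPICS[label_id]

print(topic_name(4))  # "Computers & Internet"
```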
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mychen76/cranial_nerves_llama | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 54432
num_examples: 84
download_size: 24141
dataset_size: 54432
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mteb/scala_sv_classification | ---
dataset_info:
features:
- name: text
dtype: string
- name: corruption_type
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 135999
num_examples: 1024
- name: test
num_bytes: 262897
num_examples: 2048
- name: full_train
num_bytes: 1014513
num_examples: 7446
- name: val
num_bytes: 36681
num_examples: 256
download_size: 807624
dataset_size: 1450090
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: full_train
path: data/full_train-*
- split: val
path: data/val-*
---
|
AdapterOcean/chemistry_dataset_standardized_cluster_3 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 59537763
num_examples: 5332
download_size: 18190714
dataset_size: 59537763
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "chemistry_dataset_standardized_cluster_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/marciana_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of marciana/マルチャーナ/玛律恰那/마르차나 (Nikke: Goddess of Victory)
This is the dataset of marciana/マルチャーナ/玛律恰那/마르차나 (Nikke: Goddess of Victory), containing 69 images and their tags.
The core tags of this character are `long_hair, breasts, bangs, large_breasts, brown_hair, brown_eyes, black_hair, yellow_eyes, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 69 | 132.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marciana_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 69 | 57.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marciana_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 175 | 125.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marciana_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 69 | 107.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marciana_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 175 | 205.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marciana_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/marciana_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some of the character's outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 69 |  |  |  |  |  | 1girl, blush, solo, looking_at_viewer, white_gloves, long_sleeves, white_pants, crop_top, midriff, uniform, navel_piercing, simple_background, epaulettes, white_background, ascot |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | solo | looking_at_viewer | white_gloves | long_sleeves | white_pants | crop_top | midriff | uniform | navel_piercing | simple_background | epaulettes | white_background | ascot |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------------------|:---------------|:---------------|:--------------|:-----------|:----------|:----------|:-----------------|:--------------------|:-------------|:-------------------|:--------|
| 0 | 69 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
liuqi6777/RankGPT-msmarco-100k | ---
language:
- en
license: mit
---
|
EleutherAI/quirky_capitals_raw | ---
dataset_info:
features:
- name: id
dtype: string
- name: template_args
struct:
- name: admin_name
dtype: string
- name: character
dtype: string
- name: city
dtype: string
- name: country
dtype: string
- name: character
dtype: string
- name: label
dtype: bool
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: float64
- name: difficulty_quantile
dtype: float64
splits:
- name: train
num_bytes: 86524
num_examples: 1023
- name: validation
num_bytes: 168155
num_examples: 2000
- name: test
num_bytes: 168485
num_examples: 2000
download_size: 221256
dataset_size: 423164
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Cubpaw/voxelgym_5c_42x42_10 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
- name: rgb_label
dtype: image
- name: path_label
dtype: image
- name: path_rgb_label
dtype: image
splits:
- name: train
num_bytes: 6953.0
num_examples: 8
- name: validation
num_bytes: 1776.0
num_examples: 2
download_size: 26790
dataset_size: 8729.0
---
# Dataset Card for "voxelgym_5c_42x42_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_gmonsoon__Qwenchana-1.8B | ---
pretty_name: Evaluation run of gmonsoon/Qwenchana-1.8B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gmonsoon/Qwenchana-1.8B](https://huggingface.co/gmonsoon/Qwenchana-1.8B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gmonsoon__Qwenchana-1.8B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T12:26:20.501812](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__Qwenchana-1.8B/blob/main/results_2024-02-29T12-26-20.501812.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.45428670641416624,\n\
\ \"acc_stderr\": 0.034508796664353476,\n \"acc_norm\": 0.4589475886958292,\n\
\ \"acc_norm_stderr\": 0.035262890094748575,\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156493,\n \"mc2\": 0.39584142810819417,\n\
\ \"mc2_stderr\": 0.014472162563157337\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3626279863481229,\n \"acc_stderr\": 0.014049106564955005,\n\
\ \"acc_norm\": 0.3822525597269625,\n \"acc_norm_stderr\": 0.014200454049979291\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.45757817167894843,\n\
\ \"acc_stderr\": 0.004971789638563323,\n \"acc_norm\": 0.599183429595698,\n\
\ \"acc_norm_stderr\": 0.004890623693243621\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3851851851851852,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.3851851851851852,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309172,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309172\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4867924528301887,\n \"acc_stderr\": 0.030762134874500476,\n\
\ \"acc_norm\": 0.4867924528301887,\n \"acc_norm_stderr\": 0.030762134874500476\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.04166666666666666,\n \"acc_norm\": 0.4583333333333333,\n\
\ \"acc_norm_stderr\": 0.04166666666666666\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.34656084656084657,\n \"acc_stderr\": 0.024508777521028424,\n \"\
acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.024508777521028424\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.038095238095238106,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.038095238095238106\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5129032258064516,\n\
\ \"acc_stderr\": 0.028434533152681855,\n \"acc_norm\": 0.5129032258064516,\n\
\ \"acc_norm_stderr\": 0.028434533152681855\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33004926108374383,\n \"acc_stderr\": 0.033085304262282574,\n\
\ \"acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.033085304262282574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.48,\n \"acc_stderr\": 0.05021167315686781,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.05021167315686781\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.037563357751878974,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.037563357751878974\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6161616161616161,\n \"acc_stderr\": 0.03464881675016338,\n \"\
acc_norm\": 0.6161616161616161,\n \"acc_norm_stderr\": 0.03464881675016338\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5492227979274611,\n \"acc_stderr\": 0.035909109522355244,\n\
\ \"acc_norm\": 0.5492227979274611,\n \"acc_norm_stderr\": 0.035909109522355244\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3641025641025641,\n \"acc_stderr\": 0.02439667298509476,\n \
\ \"acc_norm\": 0.3641025641025641,\n \"acc_norm_stderr\": 0.02439667298509476\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114982,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114982\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.44537815126050423,\n \"acc_stderr\": 0.0322841062671639,\n \
\ \"acc_norm\": 0.44537815126050423,\n \"acc_norm_stderr\": 0.0322841062671639\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5724770642201835,\n \"acc_stderr\": 0.02121091020430044,\n \"\
acc_norm\": 0.5724770642201835,\n \"acc_norm_stderr\": 0.02121091020430044\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3101851851851852,\n \"acc_stderr\": 0.0315469628565663,\n \"acc_norm\"\
: 0.3101851851851852,\n \"acc_norm_stderr\": 0.0315469628565663\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.45588235294117646,\n\
\ \"acc_stderr\": 0.03495624522015474,\n \"acc_norm\": 0.45588235294117646,\n\
\ \"acc_norm_stderr\": 0.03495624522015474\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.6329113924050633,\n \"acc_stderr\": 0.031376240725616185,\n\
\ \"acc_norm\": 0.6329113924050633,\n \"acc_norm_stderr\": 0.031376240725616185\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5067264573991032,\n\
\ \"acc_stderr\": 0.03355476596234354,\n \"acc_norm\": 0.5067264573991032,\n\
\ \"acc_norm_stderr\": 0.03355476596234354\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4961832061068702,\n \"acc_stderr\": 0.043851623256015534,\n\
\ \"acc_norm\": 0.4961832061068702,\n \"acc_norm_stderr\": 0.043851623256015534\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.04412015806624505,\n \"acc_norm\"\
: 0.628099173553719,\n \"acc_norm_stderr\": 0.04412015806624505\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4539877300613497,\n \"acc_stderr\": 0.0391170190467718,\n\
\ \"acc_norm\": 0.4539877300613497,\n \"acc_norm_stderr\": 0.0391170190467718\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7521367521367521,\n\
\ \"acc_stderr\": 0.0282863240755644,\n \"acc_norm\": 0.7521367521367521,\n\
\ \"acc_norm_stderr\": 0.0282863240755644\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5823754789272031,\n\
\ \"acc_stderr\": 0.017635637326951517,\n \"acc_norm\": 0.5823754789272031,\n\
\ \"acc_norm_stderr\": 0.017635637326951517\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5289017341040463,\n \"acc_stderr\": 0.02687408588351835,\n\
\ \"acc_norm\": 0.5289017341040463,\n \"acc_norm_stderr\": 0.02687408588351835\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5424836601307189,\n \"acc_stderr\": 0.028526383452142635,\n\
\ \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.028526383452142635\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.45980707395498394,\n\
\ \"acc_stderr\": 0.028306190403305696,\n \"acc_norm\": 0.45980707395498394,\n\
\ \"acc_norm_stderr\": 0.028306190403305696\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.47530864197530864,\n \"acc_stderr\": 0.02778680093142745,\n\
\ \"acc_norm\": 0.47530864197530864,\n \"acc_norm_stderr\": 0.02778680093142745\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3404255319148936,\n \"acc_stderr\": 0.028267657482650144,\n \
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.028267657482650144\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3617992177314211,\n\
\ \"acc_stderr\": 0.012272736233262936,\n \"acc_norm\": 0.3617992177314211,\n\
\ \"acc_norm_stderr\": 0.012272736233262936\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.02952009569768777,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.02952009569768777\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.43300653594771243,\n \"acc_stderr\": 0.020045442473324227,\n \
\ \"acc_norm\": 0.43300653594771243,\n \"acc_norm_stderr\": 0.020045442473324227\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.43673469387755104,\n \"acc_stderr\": 0.031751952375833226,\n\
\ \"acc_norm\": 0.43673469387755104,\n \"acc_norm_stderr\": 0.031751952375833226\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6218905472636815,\n\
\ \"acc_stderr\": 0.034288678487786564,\n \"acc_norm\": 0.6218905472636815,\n\
\ \"acc_norm_stderr\": 0.034288678487786564\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03811079669833531,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03811079669833531\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156493,\n \"mc2\": 0.39584142810819417,\n\
\ \"mc2_stderr\": 0.014472162563157337\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6029992107340174,\n \"acc_stderr\": 0.013751092519806702\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19181197877179681,\n \
\ \"acc_stderr\": 0.010845169955294014\n }\n}\n```"
repo_url: https://huggingface.co/gmonsoon/Qwenchana-1.8B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|arc:challenge|25_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|gsm8k|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hellaswag|10_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T12-26-20.501812.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T12-26-20.501812.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- '**/details_harness|winogrande|5_2024-02-29T12-26-20.501812.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T12-26-20.501812.parquet'
- config_name: results
data_files:
- split: 2024_02_29T12_26_20.501812
path:
- results_2024-02-29T12-26-20.501812.parquet
- split: latest
path:
- results_2024-02-29T12-26-20.501812.parquet
---
# Dataset Card for Evaluation run of gmonsoon/Qwenchana-1.8B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gmonsoon/Qwenchana-1.8B](https://huggingface.co/gmonsoon/Qwenchana-1.8B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
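As the file lists above show, each timestamped split name is simply the run timestamp with its `-` and `:` separators replaced by `_`. A minimal sketch of that mapping (the helper name is illustrative, not part of any API):

```python
def timestamp_to_split(timestamp: str) -> str:
    """Derive a split name from a run timestamp by replacing
    the '-' and ':' separators with '_'."""
    return timestamp.replace("-", "_").replace(":", "_")

# The run timestamp from this card maps to its split name:
print(timestamp_to_split("2024-02-29T12:26:20.501812"))
# 2024_02_29T12_26_20.501812
```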
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gmonsoon__Qwenchana-1.8B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-29T12:26:20.501812](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__Qwenchana-1.8B/blob/main/results_2024-02-29T12-26-20.501812.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.45428670641416624,
"acc_stderr": 0.034508796664353476,
"acc_norm": 0.4589475886958292,
"acc_norm_stderr": 0.035262890094748575,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156493,
"mc2": 0.39584142810819417,
"mc2_stderr": 0.014472162563157337
},
"harness|arc:challenge|25": {
"acc": 0.3626279863481229,
"acc_stderr": 0.014049106564955005,
"acc_norm": 0.3822525597269625,
"acc_norm_stderr": 0.014200454049979291
},
"harness|hellaswag|10": {
"acc": 0.45757817167894843,
"acc_stderr": 0.004971789638563323,
"acc_norm": 0.599183429595698,
"acc_norm_stderr": 0.004890623693243621
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309172,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309172
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4867924528301887,
"acc_stderr": 0.030762134874500476,
"acc_norm": 0.4867924528301887,
"acc_norm_stderr": 0.030762134874500476
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666666,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.34656084656084657,
"acc_stderr": 0.024508777521028424,
"acc_norm": 0.34656084656084657,
"acc_norm_stderr": 0.024508777521028424
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.038095238095238106,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.038095238095238106
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5129032258064516,
"acc_stderr": 0.028434533152681855,
"acc_norm": 0.5129032258064516,
"acc_norm_stderr": 0.028434533152681855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686781,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686781
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.037563357751878974,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.037563357751878974
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6161616161616161,
"acc_stderr": 0.03464881675016338,
"acc_norm": 0.6161616161616161,
"acc_norm_stderr": 0.03464881675016338
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5492227979274611,
"acc_stderr": 0.035909109522355244,
"acc_norm": 0.5492227979274611,
"acc_norm_stderr": 0.035909109522355244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3641025641025641,
"acc_stderr": 0.02439667298509476,
"acc_norm": 0.3641025641025641,
"acc_norm_stderr": 0.02439667298509476
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114982,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114982
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.44537815126050423,
"acc_stderr": 0.0322841062671639,
"acc_norm": 0.44537815126050423,
"acc_norm_stderr": 0.0322841062671639
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5724770642201835,
"acc_stderr": 0.02121091020430044,
"acc_norm": 0.5724770642201835,
"acc_norm_stderr": 0.02121091020430044
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3101851851851852,
"acc_stderr": 0.0315469628565663,
"acc_norm": 0.3101851851851852,
"acc_norm_stderr": 0.0315469628565663
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.03495624522015474,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.03495624522015474
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6329113924050633,
"acc_stderr": 0.031376240725616185,
"acc_norm": 0.6329113924050633,
"acc_norm_stderr": 0.031376240725616185
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5067264573991032,
"acc_stderr": 0.03355476596234354,
"acc_norm": 0.5067264573991032,
"acc_norm_stderr": 0.03355476596234354
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4961832061068702,
"acc_stderr": 0.043851623256015534,
"acc_norm": 0.4961832061068702,
"acc_norm_stderr": 0.043851623256015534
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.04412015806624505,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.04412015806624505
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4539877300613497,
"acc_stderr": 0.0391170190467718,
"acc_norm": 0.4539877300613497,
"acc_norm_stderr": 0.0391170190467718
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7521367521367521,
"acc_stderr": 0.0282863240755644,
"acc_norm": 0.7521367521367521,
"acc_norm_stderr": 0.0282863240755644
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5823754789272031,
"acc_stderr": 0.017635637326951517,
"acc_norm": 0.5823754789272031,
"acc_norm_stderr": 0.017635637326951517
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5289017341040463,
"acc_stderr": 0.02687408588351835,
"acc_norm": 0.5289017341040463,
"acc_norm_stderr": 0.02687408588351835
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5424836601307189,
"acc_stderr": 0.028526383452142635,
"acc_norm": 0.5424836601307189,
"acc_norm_stderr": 0.028526383452142635
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.45980707395498394,
"acc_stderr": 0.028306190403305696,
"acc_norm": 0.45980707395498394,
"acc_norm_stderr": 0.028306190403305696
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.47530864197530864,
"acc_stderr": 0.02778680093142745,
"acc_norm": 0.47530864197530864,
"acc_norm_stderr": 0.02778680093142745
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.028267657482650144,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.028267657482650144
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3617992177314211,
"acc_stderr": 0.012272736233262936,
"acc_norm": 0.3617992177314211,
"acc_norm_stderr": 0.012272736233262936
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.02952009569768777,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.02952009569768777
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.43300653594771243,
"acc_stderr": 0.020045442473324227,
"acc_norm": 0.43300653594771243,
"acc_norm_stderr": 0.020045442473324227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.43673469387755104,
"acc_stderr": 0.031751952375833226,
"acc_norm": 0.43673469387755104,
"acc_norm_stderr": 0.031751952375833226
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6218905472636815,
"acc_stderr": 0.034288678487786564,
"acc_norm": 0.6218905472636815,
"acc_norm_stderr": 0.034288678487786564
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03811079669833531,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03811079669833531
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156493,
"mc2": 0.39584142810819417,
"mc2_stderr": 0.014472162563157337
},
"harness|winogrande|5": {
"acc": 0.6029992107340174,
"acc_stderr": 0.013751092519806702
},
"harness|gsm8k|5": {
"acc": 0.19181197877179681,
"acc_stderr": 0.010845169955294014
}
}
```
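The `acc_stderr` values in the results above are consistent with the usual sample standard error of a binomial proportion, `sqrt(p * (1 - p) / (n - 1))`, where `n` is the number of questions in the task. A quick sanity check against the `college_physics` entry (assuming n = 102 questions, the size of that MMLU subset, so acc = 24/102):

```python
import math

def acc_stderr(p: float, n: int) -> float:
    """Sample standard error of an accuracy p estimated from n questions."""
    return math.sqrt(p * (1.0 - p) / (n - 1))

# college_physics above: acc = 24/102 = 0.23529411764705882, n = 102
print(acc_stderr(24 / 102, 102))  # matches the reported 0.04220773659171453
```

This is a consistency check on the reported numbers, not a claim about how the harness computes every interval.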
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
HuggingFaceM4/COCO-2014_captions-Sample | |
HydraLM/partitioned_v2_standardized_05 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_conversation_id
dtype: string
splits:
- name: train
num_bytes: 45788986.07121913
num_examples: 89579
download_size: 31339745
dataset_size: 45788986.07121913
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v2_standardized_05"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xwjzds/paraphrase_collections | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 34347236
num_examples: 223241
download_size: 21377198
dataset_size: 34347236
---
# Dataset Card for Sentence Paraphrase Collections

## Dataset Description

- **Repository:**
- **Paper:** DeTiME: Diffusion-Enhanced Topic Modeling using Encoder-decoder based LLM, https://arxiv.org/abs/2310.15296
- **Leaderboard:**
- **Point of Contact:** Weijie Xu

### Dataset Summary

Sentence_Paraphrase is a combination of sentence paraphrase tasks from various sources, such as paraphrases generated with ChatGPT, Paraphrase Adversaries from Word Scrambling (PAWS), and the STS benchmark. We filtered out pairs that were detected as non-English, were too short, or did not have a high similarity score.

| Category   | Count  |
|------------|--------|
| Paraphrase | 223241 |
## Dataset Structure

### Data Instances

An example of the data is as follows:

{'input': 'U.S. prosecutors have arrested more than 130 individuals and have seized more than $17 million in a continuing crackdown on Internet fraud and abuse.',
 'output': 'More than 130 people have been arrested and $17 million worth of property seized in an Internet fraud sweep announced Friday by three U.S. government agencies.'}

### Data Fields

The data fields are as follows:

- `input` and `output` are paraphrases of a sentence or paragraph.
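The filtering described in the summary (dropping non-English, too-short, or low-similarity pairs) can be sketched roughly as below. The thresholds and the `difflib`-based similarity are illustrative stand-ins only; the actual pipeline (language detector, similarity model) is not specified in this card.

```python
from difflib import SequenceMatcher

def keep_pair(inp: str, out: str,
              min_len: int = 20, min_sim: float = 0.3, max_sim: float = 0.99) -> bool:
    """Keep a paraphrase pair unless it is too short, near-identical,
    or too dissimilar. All thresholds here are hypothetical."""
    if len(inp) < min_len or len(out) < min_len:
        return False
    # Stand-in for a learned similarity score: character-level ratio in [0, 1].
    sim = SequenceMatcher(None, inp, out).ratio()
    return min_sim <= sim <= max_sim
```

A real pipeline would also apply language identification before the similarity step.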
## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

The dataset is available under the Creative Commons NonCommercial license (CC BY-NC 4.0).
### Citation Information
@misc{xu2023detime,
title={DeTiME: Diffusion-Enhanced Topic Modeling using Encoder-decoder based LLM},
author={Weijie Xu and Wenxiang Hu and Fanyou Wu and Srinivasan Sengamedu},
year={2023},
eprint={2310.15296},
archivePrefix={arXiv},
primaryClass={cs.CL}
} |
denysdios/alcace_speech_choice_en | ---
license: apache-2.0
dataset_info:
- config_name: facebook_mms-tts-eng
features:
- name: audio
dtype: audio
- name: id
dtype: string
- name: text
dtype: string
- name: time
dtype: float64
splits:
- name: train
num_bytes: 3956228.0
num_examples: 20
download_size: 3746153
dataset_size: 3956228.0
- config_name: microsoft_speecht5_tts_cmu-arctic-xvectors_4777
features:
- name: audio
dtype: audio
- name: id
dtype: string
- name: text
dtype: string
- name: time
dtype: float64
splits:
- name: train
num_bytes: 3730780.0
num_examples: 20
download_size: 3722345
dataset_size: 3730780.0
- config_name: tts_models_multilingual_multi-dataset_xtts_v2
features:
- name: audio
dtype: audio
- name: id
dtype: string
- name: text
dtype: string
- name: time
dtype: float64
splits:
- name: train
num_bytes: 3051756.0
num_examples: 20
download_size: 1831300
dataset_size: 3051756.0
configs:
- config_name: facebook_mms-tts-eng
data_files:
- split: train
path: facebook_mms-tts-eng/train-*
- config_name: microsoft_speecht5_tts_cmu-arctic-xvectors_4777
data_files:
- split: train
path: microsoft_speecht5_tts_cmu-arctic-xvectors_4777/train-*
- config_name: tts_models_multilingual_multi-dataset_xtts_v2
data_files:
- split: train
path: tts_models_multilingual_multi-dataset_xtts_v2/train-*
---
|
RogerB/Kinyarwanda_wikipedia20230920 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 11949533
num_examples: 8046
download_size: 6643489
dataset_size: 11949533
---
# Dataset Card for "Kinyarwanda_wikipedia20230920"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_liuxiang886__llama2-70B-qlora-gpt4 | ---
pretty_name: Evaluation run of liuxiang886/llama2-70B-qlora-gpt4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [liuxiang886/llama2-70B-qlora-gpt4](https://huggingface.co/liuxiang886/llama2-70B-qlora-gpt4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liuxiang886__llama2-70B-qlora-gpt4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T18:00:05.987903](https://huggingface.co/datasets/open-llm-leaderboard/details_liuxiang886__llama2-70B-qlora-gpt4/blob/main/results_2023-09-17T18-00-05.987903.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4848993288590604,\n\
\ \"em_stderr\": 0.005118132215061967,\n \"f1\": 0.5715404781879219,\n\
\ \"f1_stderr\": 0.004685062097512246,\n \"acc\": 0.5587922375481174,\n\
\ \"acc_stderr\": 0.011536318547544595\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.4848993288590604,\n \"em_stderr\": 0.005118132215061967,\n\
\ \"f1\": 0.5715404781879219,\n \"f1_stderr\": 0.004685062097512246\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.288855193328279,\n \
\ \"acc_stderr\": 0.012484219800126664\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.8287292817679558,\n \"acc_stderr\": 0.010588417294962526\n\
\ }\n}\n```"
repo_url: https://huggingface.co/liuxiang886/llama2-70B-qlora-gpt4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|arc:challenge|25_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T18_00_05.987903
path:
- '**/details_harness|drop|3_2023-09-17T18-00-05.987903.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T18-00-05.987903.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T18_00_05.987903
path:
- '**/details_harness|gsm8k|5_2023-09-17T18-00-05.987903.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T18-00-05.987903.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hellaswag|10_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T20:45:03.475580.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T20:45:03.475580.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T20:45:03.475580.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T18_00_05.987903
path:
- '**/details_harness|winogrande|5_2023-09-17T18-00-05.987903.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T18-00-05.987903.parquet'
- config_name: results
data_files:
- split: 2023_08_09T20_45_03.475580
path:
- results_2023-08-09T20:45:03.475580.parquet
- split: 2023_09_17T18_00_05.987903
path:
- results_2023-09-17T18-00-05.987903.parquet
- split: latest
path:
- results_2023-09-17T18-00-05.987903.parquet
---
# Dataset Card for Evaluation run of liuxiang886/llama2-70B-qlora-gpt4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/liuxiang886/llama2-70B-qlora-gpt4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [liuxiang886/llama2-70B-qlora-gpt4](https://huggingface.co/liuxiang886/llama2-70B-qlora-gpt4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liuxiang886__llama2-70B-qlora-gpt4",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T18:00:05.987903](https://huggingface.co/datasets/open-llm-leaderboard/details_liuxiang886__llama2-70B-qlora-gpt4/blob/main/results_2023-09-17T18-00-05.987903.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.4848993288590604,
"em_stderr": 0.005118132215061967,
"f1": 0.5715404781879219,
"f1_stderr": 0.004685062097512246,
"acc": 0.5587922375481174,
"acc_stderr": 0.011536318547544595
},
"harness|drop|3": {
"em": 0.4848993288590604,
"em_stderr": 0.005118132215061967,
"f1": 0.5715404781879219,
"f1_stderr": 0.004685062097512246
},
"harness|gsm8k|5": {
"acc": 0.288855193328279,
"acc_stderr": 0.012484219800126664
},
"harness|winogrande|5": {
"acc": 0.8287292817679558,
"acc_stderr": 0.010588417294962526
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
nastyboget/synthetic_hkr | ---
license: mit
task_categories:
- image-to-text
language:
- ru
size_categories:
- 100K<n<1M
---
Dataset generated using handwritten fonts
=========================================
Number of images: 300000
Sources:
* [Handwriting generation code](https://github.com/NastyBoget/HandwritingGeneration)
The code was executed with the `hkr` option (with fewer augmentations) |
zolak/twitter_dataset_80_1713037349 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 3329986
num_examples: 8311
download_size: 1652236
dataset_size: 3329986
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ibranze/araproje_hellaswag_tr_f2 | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 162703.0
num_examples: 250
download_size: 88738
dataset_size: 162703.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_tr_f2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hieunguyenminh/roleplay | ---
language:
- en
size_categories:
- 1K<n<10K
task_categories:
- text-generation
- question-answering
dataset_info:
features:
- name: name
dtype: string
- name: description
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 14924724
num_examples: 5755
download_size: 2153926
dataset_size: 14924724
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-4.0
tags:
- roleplay
- characters
---
<h1 align="center"> 🎭 Roleplay TTL</h1>
<p align="center">
<img src="https://bots-ttl.s3.amazonaws.com/intro1.png" alt="Your Image" width="500">
</p>
<p align="center">Let AI be any character you want to play with!</p>
## Dataset Overview
This dataset trains conversational AI to embody a wide range of original characters, each with a unique persona. It includes fictional characters, complete with their own backgrounds, core traits, relationships, goals, and distinct speaking styles.
## Dataset Details
- **Curated by:** [Hieu Minh Nguyen](mywebleo.com)
- **Language(s) (NLP):** Primarily English (with potential for multilingual extensions)
- **License:** Creative Commons Attribution 4.0 International License
- **Version:** 1.0 (A new version, with topics included for each entry and 10,000+ additional entries, will be released soon.)
## Dataset Description
### The dataset includes:
- Name and the description of the character.
- System messages that define each character's persona.
- Conversational exchanges demonstrating typical reactions in various scenarios.
- Coverage of different emotions and topics, with direct quotes and signature linguistic ticks.
- Includes a wide array of characters, ranging from well-known fictional figures to **completely original, self-created personas**.
#### Dataset Composition
- **Number of Rows:** Over 5000 entries, each representing a unique interaction.
- **Interaction Style:** Each dataset entry consists of a system message defining the character's traits, followed by 3-5 conversational exchanges between the character and a user.
#### Dataset Goals and Applications
- **Training Objectives:** Ideal for training AI in role-playing applications, chatbots, interactive storytelling, and creative writing tools.
- **Research Value:** Useful for studies in character-driven narrative generation, conversational AI, and creative writing in AI.
- **Out-of-Scope Use:** Not suited for tasks unrelated to conversational or creative AI.
#### Conversational Dynamics
- **Realism in Dialogue:** Each exchange is crafted to mirror realistic conversations, maintaining the authenticity of characters' voices.
- **Language Variability:** Diverse linguistic styles and dialects are used, tailored to each character's background and persona.
- **Humor and Wit:** Includes witty banter and humorous exchanges, adding a layer of entertainment and relatability.
## Dataset Structure
- `name`: Name of the character.
- `description`: Detailed description of the character's persona.
- `text`: Corresponding responses in the character's unique style.
The `text` field is formatted as follows (the system message followed by 4-5 conversation turns):

```
<|system|>...</s>\n<|user|>...</s>\n<|assistant|>...</s>\n<|user|>\n<|assistant|>...</s>
```
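As a rough illustration of this layout (a minimal sketch, assuming the `<|system|>`/`<|user|>`/`<|assistant|>` markers and `</s>` terminators appear exactly as in the template above), the turns of a `text` entry can be split out with plain string handling:

```python
import re

# One turn = a <|role|> tag followed by its content, terminated by </s>.
TURN_RE = re.compile(r"<\|(system|user|assistant)\|>(.*?)</s>", re.DOTALL)

def split_turns(text: str):
    """Split a roleplay `text` entry into (role, content) pairs.

    Assumes the delimiters match the template shown above; turns
    without a closing </s> are not captured.
    """
    return [(role, content.strip()) for role, content in TURN_RE.findall(text)]

sample = "<|system|>You are a pirate.</s>\n<|user|>Hello!</s>\n<|assistant|>Ahoy, matey!</s>"
print(split_turns(sample))
# → [('system', 'You are a pirate.'), ('user', 'Hello!'), ('assistant', 'Ahoy, matey!')]
```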
## Data Creation and Processing
Characters are created using the imaginative writing capabilities of [Gemini Pro](https://deepmind.google/technologies/gemini/#build-with-gemini), ensuring a diverse range of personas. Conversations are scripted to reflect different scenarios, emotions, and interactions.
--- |
open-llm-leaderboard/details_digitous__GPT-R | ---
pretty_name: Evaluation run of digitous/GPT-R
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [digitous/GPT-R](https://huggingface.co/digitous/GPT-R) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_digitous__GPT-R\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-21T16:59:10.441941](https://huggingface.co/datasets/open-llm-leaderboard/details_digitous__GPT-R/blob/main/results_2023-10-21T16-59-10.441941.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.00036305608931189593,\n \"f1\": 0.05138632550335586,\n\
\ \"f1_stderr\": 0.0012400453401352261,\n \"acc\": 0.32998109710963497,\n\
\ \"acc_stderr\": 0.00845227996433148\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931189593,\n\
\ \"f1\": 0.05138632550335586,\n \"f1_stderr\": 0.0012400453401352261\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \
\ \"acc_stderr\": 0.0034478192723890067\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6440410418310971,\n \"acc_stderr\": 0.013456740656273952\n\
\ }\n}\n```"
repo_url: https://huggingface.co/digitous/GPT-R
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|arc:challenge|25_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_21T16_59_10.441941
path:
- '**/details_harness|drop|3_2023-10-21T16-59-10.441941.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-21T16-59-10.441941.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_21T16_59_10.441941
path:
- '**/details_harness|gsm8k|5_2023-10-21T16-59-10.441941.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-21T16-59-10.441941.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hellaswag|10_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T20:10:48.990479.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T20:10:48.990479.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T20:10:48.990479.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_21T16_59_10.441941
path:
- '**/details_harness|winogrande|5_2023-10-21T16-59-10.441941.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-21T16-59-10.441941.parquet'
- config_name: results
data_files:
- split: 2023_07_19T20_10_48.990479
path:
- results_2023-07-19T20:10:48.990479.parquet
- split: 2023_10_21T16_59_10.441941
path:
- results_2023-10-21T16-59-10.441941.parquet
- split: latest
path:
- results_2023-10-21T16-59-10.441941.parquet
---
# Dataset Card for Evaluation run of digitous/GPT-R
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/digitous/GPT-R
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [digitous/GPT-R](https://huggingface.co/digitous/GPT-R) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_digitous__GPT-R",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-21T16:59:10.441941](https://huggingface.co/datasets/open-llm-leaderboard/details_digitous__GPT-R/blob/main/results_2023-10-21T16-59-10.441941.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931189593,
"f1": 0.05138632550335586,
"f1_stderr": 0.0012400453401352261,
"acc": 0.32998109710963497,
"acc_stderr": 0.00845227996433148
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931189593,
"f1": 0.05138632550335586,
"f1_stderr": 0.0012400453401352261
},
"harness|gsm8k|5": {
"acc": 0.01592115238817286,
"acc_stderr": 0.0034478192723890067
},
"harness|winogrande|5": {
"acc": 0.6440410418310971,
"acc_stderr": 0.013456740656273952
}
}
```
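For quick inspection, the nested results dict above can be flattened into `(task, metric, value)` rows — a minimal sketch in plain Python, with the values copied from the latest run shown above:

```python
# Values copied from the latest results of run 2023-10-21T16:59:10.441941.
results = {
    "all": {
        "em": 0.0012583892617449664,
        "em_stderr": 0.00036305608931189593,
        "f1": 0.05138632550335586,
        "f1_stderr": 0.0012400453401352261,
        "acc": 0.32998109710963497,
        "acc_stderr": 0.00845227996433148,
    },
    "harness|drop|3": {
        "em": 0.0012583892617449664,
        "em_stderr": 0.00036305608931189593,
        "f1": 0.05138632550335586,
        "f1_stderr": 0.0012400453401352261,
    },
    "harness|gsm8k|5": {
        "acc": 0.01592115238817286,
        "acc_stderr": 0.0034478192723890067,
    },
    "harness|winogrande|5": {
        "acc": 0.6440410418310971,
        "acc_stderr": 0.013456740656273952,
    },
}

# Flatten the nested dict into one row per (task, metric) pair.
rows = [
    (task, metric, value)
    for task, metrics in results.items()
    for metric, value in metrics.items()
]

for task, metric, value in rows:
    print(f"{task:25s} {metric:12s} {value:.4f}")
```

The same pattern works on the dict returned by loading the "results" config with `datasets`.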
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mask-distilled-one-sec-cv12/chunk_21 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1333869768
num_examples: 261954
download_size: 1360945029
dataset_size: 1333869768
---
# Dataset Card for "chunk_21"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
talrid/CodeContests_valid_and_test_AlphaCodium | ---
license: apache-2.0
---
|
Nexdata/10.4_Hours_Chinese_Mandarin_Synthesis_Corpus_Female_Customer_Service | ---
license: cc-by-nc-nd-4.0
---
## Description
10.4 Hours - Chinese Mandarin Synthesis Corpus - Female, Customer Service. It is recorded by Chinese native speakers with a sweet voice, and the phoneme coverage is balanced. Professional phoneticians participated in the annotation. It precisely matches the research and development needs of speech synthesis.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1273?source=Huggingface
## Format
48,000Hz, 16bit, uncompressed wav, mono channel;
## Recording environment
professional recording studio;
## Recording content
9,286 sentences of customer service and dialogue text, and the syllables, phonemes and tones are balanced;
## Speaker
female, 20-30 years old, lively and sweet voice;
## Device
microphone;
## Language
Mandarin;
## Annotation
word and Pinyin transcription, four-level prosodic boundary annotation;
## Application scenarios
speech synthesis.
# Licensing Information
Commercial License
|
semeru/code-code-InjectMutants | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: validation
num_bytes: 3146615
num_examples: 11560
- name: train
num_bytes: 25193181
num_examples: 92476
- name: test
num_bytes: 3154425
num_examples: 11559
download_size: 0
dataset_size: 31494221
---
# Dataset Card for "MG_finetuning"
## Reference
<pre><code>@article{Mastropaolo2022TransferLearningForCodeRelatedTasks,
title={Using Transfer Learning for Code-Related Tasks},
author={Mastropaolo, Antonio and Cooper, Nathan and Nader Palacio, David and Scalabrino, Simone and
Poshyvanyk, Denys and Oliveto, Rocco and Bavota, Gabriele},
journal={arXiv preprint arXiv:2206.08574},
year={2022}
}</code></pre>[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kwanyick/cover-letter-dataset-text-prompt | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1231557.1678141137
num_examples: 813
- name: test
num_bytes: 528675.8321858865
num_examples: 349
download_size: 594129
dataset_size: 1760233.0
---
# Dataset Card for "cover-letter-dataset-text-prompt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jiahuan/dst_it | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 3925781
num_examples: 2535
- name: val
num_bytes: 1277594
num_examples: 830
- name: test
num_bytes: 2544654
num_examples: 1646
download_size: 298489
dataset_size: 7748029
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
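The `dataset_info` block above is internally consistent: the per-split byte counts sum to the stated `dataset_size`. A throwaway sanity sketch, with the values copied verbatim from the YAML header:

```python
# Per-split sizes copied from the dataset_info block above.
splits = {
    "train": {"num_bytes": 3925781, "num_examples": 2535},
    "val": {"num_bytes": 1277594, "num_examples": 830},
    "test": {"num_bytes": 2544654, "num_examples": 1646},
}

total_bytes = sum(s["num_bytes"] for s in splits.values())
total_examples = sum(s["num_examples"] for s in splits.values())

# dataset_size in the header should equal the sum of the split byte counts.
assert total_bytes == 7748029
```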
|
anan-2024/twitter_dataset_1713220546 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 90292
num_examples: 228
download_size: 53109
dataset_size: 90292
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
laion/meta-imagine-dataset | ---
dataset_info:
features:
- name: caption
dtype: string
- name: image
dtype: image
- name: link
dtype: string
- name: message_id
dtype: string
- name: timestamp
dtype: string
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 0
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
iansousa12/silvervoz | ---
license: openrail
---
|
FINNUMBER/FINCH_TRAIN_QA_MCQA_100_NEWFORMAT | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 420479
num_examples: 100
download_size: 256771
dataset_size: 420479
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Fredithefish__ScarletPajama-3B-HF | ---
pretty_name: Evaluation run of Fredithefish/ScarletPajama-3B-HF
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Fredithefish/ScarletPajama-3B-HF](https://huggingface.co/Fredithefish/ScarletPajama-3B-HF)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Fredithefish__ScarletPajama-3B-HF\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T04:53:29.822366](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__ScarletPajama-3B-HF/blob/main/results_2023-10-17T04-53-29.822366.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004404362416107382,\n\
\ \"em_stderr\": 0.000678145162047974,\n \"f1\": 0.05973783557047,\n\
\ \"f1_stderr\": 0.00145869394982755,\n \"acc\": 0.3235523790774504,\n\
\ \"acc_stderr\": 0.00738110264721833\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.004404362416107382,\n \"em_stderr\": 0.000678145162047974,\n\
\ \"f1\": 0.05973783557047,\n \"f1_stderr\": 0.00145869394982755\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.0013121578148674066\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6448303078137332,\n \"acc_stderr\": 0.013450047479569254\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Fredithefish/ScarletPajama-3B-HF
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|arc:challenge|25_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|arc:challenge|25_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|arc:challenge|25_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T04_53_29.822366
path:
- '**/details_harness|drop|3_2023-10-17T04-53-29.822366.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T04-53-29.822366.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T04_53_29.822366
path:
- '**/details_harness|gsm8k|5_2023-10-17T04-53-29.822366.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T04-53-29.822366.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hellaswag|10_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hellaswag|10_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hellaswag|10_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T10:40:07.998848.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T10:59:29.744691.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:22:36.276384.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T10:40:07.998848.parquet'
- split: 2023_07_18T10_59_29.744691
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T10:59:29.744691.parquet'
- split: 2023_07_18T11_22_36.276384
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T11:22:36.276384.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T11:22:36.276384.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T04_53_29.822366
path:
- '**/details_harness|winogrande|5_2023-10-17T04-53-29.822366.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T04-53-29.822366.parquet'
- config_name: results
data_files:
- split: 2023_07_18T10_40_07.998848
path:
- results_2023-07-18T10:40:07.998848.parquet
- split: 2023_07_18T10_59_29.744691
path:
- results_2023-07-18T10:59:29.744691.parquet
- split: 2023_07_18T11_22_36.276384
path:
- results_2023-07-18T11:22:36.276384.parquet
- split: 2023_10_17T04_53_29.822366
path:
- results_2023-10-17T04-53-29.822366.parquet
- split: latest
path:
- results_2023-10-17T04-53-29.822366.parquet
---
# Dataset Card for Evaluation run of Fredithefish/ScarletPajama-3B-HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Fredithefish/ScarletPajama-3B-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Fredithefish/ScarletPajama-3B-HF](https://huggingface.co/Fredithefish/ScarletPajama-3B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Fredithefish__ScarletPajama-3B-HF",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-17T04:53:29.822366](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__ScarletPajama-3B-HF/blob/main/results_2023-10-17T04-53-29.822366.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.004404362416107382,
"em_stderr": 0.000678145162047974,
"f1": 0.05973783557047,
"f1_stderr": 0.00145869394982755,
"acc": 0.3235523790774504,
"acc_stderr": 0.00738110264721833
},
"harness|drop|3": {
"em": 0.004404362416107382,
"em_stderr": 0.000678145162047974,
"f1": 0.05973783557047,
"f1_stderr": 0.00145869394982755
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674066
},
"harness|winogrande|5": {
"acc": 0.6448303078137332,
"acc_stderr": 0.013450047479569254
}
}
```
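In this run the aggregate `acc` in the `"all"` block appears to be the unweighted mean of the per-task `acc` values (gsm8k and winogrande here); a quick sanity check, assuming that averaging convention:

```python
# Reproduce the aggregate "acc" from the per-task scores reported above.
# Assumption: the "all" block averages per-task metrics without weighting.
gsm8k_acc = 0.002274450341167551
winogrande_acc = 0.6448303078137332

all_acc = (gsm8k_acc + winogrande_acc) / 2
assert abs(all_acc - 0.3235523790774504) < 1e-12
print(all_acc)  # ≈ 0.3236, matching the reported "all" acc
```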
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
doof-ferb/vais1000 | ---
license: cc-by-4.0
task_categories:
- automatic-speech-recognition
- text-to-speech
language:
- vi
pretty_name: VAIS-1000
size_categories:
- n<1K
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 187348211
num_examples: 1000
download_size: 169120503
dataset_size: 187348211
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# unofficial mirror of VAIS-1000
Official announcement: https://vais.vn/vi/tai-ve/hts_for_vietnamese (dead link)
Mirror: https://github.com/undertheseanlp/text_to_speech/tree/run/data/vais1000/raw
Small dataset: only ~1 h 40 min of audio from 1 speaker (female, northern accent), 1k samples.
Pre-processing: none.
To do: check for misspellings; restore foreign words that were phonetised into Vietnamese.
Usage with Hugging Face `datasets`:
```python
# pip install -q "datasets[audio]"
from datasets import load_dataset
from torch.utils.data import DataLoader

# load the single "train" split and expose both columns as PyTorch tensors
dataset = load_dataset("doof-ferb/vais1000", split="train")
dataset.set_format(type="torch", columns=["audio", "transcription"])

# iterate in mini-batches for training or evaluation
dataloader = DataLoader(dataset, batch_size=4)
``` |
distilled-from-one-sec-cv12/chunk_215 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1097914420
num_examples: 213935
download_size: 1122879661
dataset_size: 1097914420
---
# Dataset Card for "chunk_215"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
weaverlabs/gutenberg-conversations | ---
license: mit
---
# The Gutenberg Conversations Dataset
A comprehensive collection meticulously curated from the extensive library of Project Gutenberg. This dataset specifically focuses on conversational excerpts from a diverse range of literary works, spanning various genres and time periods. It is designed to support and advance research in natural language processing, conversational analysis, machine learning, and linguistics.
Each entry in the dataset represents a conversational excerpt, enriched with additional metadata for deeper context and analysis. The metadata includes, but is not limited to, the author's name, publication year, literary genre, and a unique conversation identifier. This enhanced structure facilitates a multifaceted exploration of dialogues, offering insights into linguistic styles, historical language evolution, and narrative techniques across different literary epochs.
The dataset is organized into three primary splits: train, validation, and test, ensuring a robust framework for developing and evaluating machine learning models. To accommodate the vast volume of data while adhering to filesystem limitations, the train split is further divided into multiple subdirectories, each containing a portion of the data. This hierarchical organization supports efficient data management and scalability.
**Primary Uses:**
This dataset is intended for use in training conversational AI models, analyzing dialogue structures within literature, studying historical shifts in language use, and exploring genre-specific conversational styles. It offers a rich resource for academic researchers, data scientists, and enthusiasts in the field of computational linguistics and AI.
## Dataset Structure
**Data Files:** Each .json file in the dataset contains multiple entries of conversational excerpts, along with their corresponding metadata.
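Based on the metadata fields described above, an individual entry might look like the following sketch (the field names `conversation_id`, `author`, `publication_year`, `genre`, and `turns` are illustrative assumptions, not the dataset's confirmed schema):

```python
# Hypothetical entry matching the metadata described in this card;
# the actual JSON field names may differ.
example_entry = {
    "conversation_id": "pg-000123-conv-07",  # unique conversation identifier
    "author": "Jane Austen",                 # author's name
    "publication_year": 1813,                # publication year
    "genre": "romance",                      # literary genre
    "turns": [                               # the conversational excerpt itself
        '"I do not cough for my own amusement," replied Kitty fretfully.',
        '"Kitty has no discretion in her coughs," said her father.',
    ],
}

# basic sanity checks on the sketched structure
assert {"conversation_id", "author", "publication_year", "genre", "turns"} <= set(example_entry)
assert all(isinstance(turn, str) for turn in example_entry["turns"])
```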
**Splits:** The dataset is divided into train, validation, and test splits to support machine learning workflows. The train split is further segmented into subdirectories to manage the large and growing volume of data.
- Train: Aimed at training machine learning models, containing the majority of the dataset.
- Validation: Used for tuning model parameters and preventing overfitting.
- Test: Reserved for final evaluation of the models' performance on unseen data.
**Accessibility:**
The dataset is available for public use and can be accessed through the Hugging Face 🤗 Datasets platform. Users are encouraged to share improvements, annotations, or any enhancements made to the dataset.
|
open-llm-leaderboard/details_Gille__StrangeMerges_16-7B-slerp | ---
pretty_name: Evaluation run of Gille/StrangeMerges_16-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Gille/StrangeMerges_16-7B-slerp](https://huggingface.co/Gille/StrangeMerges_16-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_16-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T03:08:22.269991](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_16-7B-slerp/blob/main/results_2024-02-02T03-08-22.269991.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6605358074265332,\n\
\ \"acc_stderr\": 0.03176350775454718,\n \"acc_norm\": 0.6607069804303847,\n\
\ \"acc_norm_stderr\": 0.032415467522633835,\n \"mc1\": 0.4565483476132191,\n\
\ \"mc1_stderr\": 0.01743728095318369,\n \"mc2\": 0.629677373384675,\n\
\ \"mc2_stderr\": 0.01522731253886815\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6646757679180887,\n \"acc_stderr\": 0.01379618294778556,\n\
\ \"acc_norm\": 0.6902730375426621,\n \"acc_norm_stderr\": 0.013512058415238363\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6878111929894444,\n\
\ \"acc_stderr\": 0.0046243936909669036,\n \"acc_norm\": 0.871539533957379,\n\
\ \"acc_norm_stderr\": 0.0033391798350182857\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.0373852067611967,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.0373852067611967\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544064,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544064\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.031967586978353627,\n\
\ \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.031967586978353627\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7967741935483871,\n \"acc_stderr\": 0.022891687984554963,\n \"\
acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.022891687984554963\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136098,\n\
\ \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136098\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624734,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624734\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8378033205619413,\n\
\ \"acc_stderr\": 0.013182222616720885,\n \"acc_norm\": 0.8378033205619413,\n\
\ \"acc_norm_stderr\": 0.013182222616720885\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40893854748603353,\n\
\ \"acc_stderr\": 0.016442830654715544,\n \"acc_norm\": 0.40893854748603353,\n\
\ \"acc_norm_stderr\": 0.016442830654715544\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7746913580246914,\n \"acc_stderr\": 0.02324620264781975,\n\
\ \"acc_norm\": 0.7746913580246914,\n \"acc_norm_stderr\": 0.02324620264781975\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4791395045632334,\n\
\ \"acc_stderr\": 0.012759117066518015,\n \"acc_norm\": 0.4791395045632334,\n\
\ \"acc_norm_stderr\": 0.012759117066518015\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.01899970738316267,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.01899970738316267\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960234,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960234\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4565483476132191,\n\
\ \"mc1_stderr\": 0.01743728095318369,\n \"mc2\": 0.629677373384675,\n\
\ \"mc2_stderr\": 0.01522731253886815\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8129439621152328,\n \"acc_stderr\": 0.010959716435242914\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7073540561031084,\n \
\ \"acc_stderr\": 0.012532334368242885\n }\n}\n```"
repo_url: https://huggingface.co/Gille/StrangeMerges_16-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|arc:challenge|25_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|gsm8k|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hellaswag|10_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-08-22.269991.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T03-08-22.269991.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- '**/details_harness|winogrande|5_2024-02-02T03-08-22.269991.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T03-08-22.269991.parquet'
- config_name: results
data_files:
- split: 2024_02_02T03_08_22.269991
path:
- results_2024-02-02T03-08-22.269991.parquet
- split: latest
path:
- results_2024-02-02T03-08-22.269991.parquet
---
# Dataset Card for Evaluation run of Gille/StrangeMerges_16-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_16-7B-slerp](https://huggingface.co/Gille/StrangeMerges_16-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_16-7B-slerp",
"harness_winogrande_5",
split="train")
```
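The per-task configuration names above follow a predictable pattern (`harness_<task>_<n_shot>`, with MMLU subjects carrying an additional `hendrycksTest_` prefix), so you can build them programmatically rather than copying them by hand. A minimal sketch, where `harness_config` is a hypothetical helper (the actual `load_dataset` call is shown commented out because it requires network access):

```python
# Config names in this repo follow "harness_<task>_<n_shot>";
# MMLU subjects additionally carry the "hendrycksTest_" prefix.
def harness_config(task: str, n_shot: int) -> str:
    return f"harness_{task}_{n_shot}"

# e.g. the 5-shot world_religions MMLU subset:
config_name = harness_config("hendrycksTest_world_religions", 5)

# from datasets import load_dataset
# data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_16-7B-slerp",
#                     config_name, split="latest")  # requires network access
print(config_name)  # harness_hendrycksTest_world_religions_5
```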
## Latest results
These are the [latest results from run 2024-02-02T03:08:22.269991](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_16-7B-slerp/blob/main/results_2024-02-02T03-08-22.269991.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6605358074265332,
"acc_stderr": 0.03176350775454718,
"acc_norm": 0.6607069804303847,
"acc_norm_stderr": 0.032415467522633835,
"mc1": 0.4565483476132191,
"mc1_stderr": 0.01743728095318369,
"mc2": 0.629677373384675,
"mc2_stderr": 0.01522731253886815
},
"harness|arc:challenge|25": {
"acc": 0.6646757679180887,
"acc_stderr": 0.01379618294778556,
"acc_norm": 0.6902730375426621,
"acc_norm_stderr": 0.013512058415238363
},
"harness|hellaswag|10": {
"acc": 0.6878111929894444,
"acc_stderr": 0.0046243936909669036,
"acc_norm": 0.871539533957379,
"acc_norm_stderr": 0.0033391798350182857
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.0373852067611967,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.0373852067611967
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544064,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544064
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554963,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608301,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608301
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136098,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136098
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624734,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624734
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8378033205619413,
"acc_stderr": 0.013182222616720885,
"acc_norm": 0.8378033205619413,
"acc_norm_stderr": 0.013182222616720885
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40893854748603353,
"acc_stderr": 0.016442830654715544,
"acc_norm": 0.40893854748603353,
"acc_norm_stderr": 0.016442830654715544
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7746913580246914,
"acc_stderr": 0.02324620264781975,
"acc_norm": 0.7746913580246914,
"acc_norm_stderr": 0.02324620264781975
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4791395045632334,
"acc_stderr": 0.012759117066518015,
"acc_norm": 0.4791395045632334,
"acc_norm_stderr": 0.012759117066518015
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170598,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170598
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.01899970738316267,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.01899970738316267
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960234,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960234
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4565483476132191,
"mc1_stderr": 0.01743728095318369,
"mc2": 0.629677373384675,
"mc2_stderr": 0.01522731253886815
},
"harness|winogrande|5": {
"acc": 0.8129439621152328,
"acc_stderr": 0.010959716435242914
},
"harness|gsm8k|5": {
"acc": 0.7073540561031084,
"acc_stderr": 0.012532334368242885
}
}
```
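Each per-task entry in the JSON above reports its own accuracy metric, so a simple macro-average can be computed by iterating over the task dictionaries. A minimal sketch using a toy subset of the values shown (note the leaderboard's own aggregation rules may differ, e.g. in which metric it picks per benchmark):

```python
# A toy subset of the results JSON above (values copied from the report).
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6902730375426621},
    "harness|hellaswag|10": {"acc_norm": 0.871539533957379},
    "harness|winogrande|5": {"acc": 0.8129439621152328},
}

# Unweighted mean, preferring acc_norm where a task reports it;
# the leaderboard's actual aggregation may use different rules.
scores = [m.get("acc_norm", m.get("acc")) for m in results.values()]
mean = sum(scores) / len(scores)
print(f"{mean:.4f}")
```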
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
breno30/MilenaRock | ---
license: openrail
---
|
arieg/bw_spec_cls_4_04_s_200 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '213'
'1': '255'
'2': '256'
'3': '368'
splits:
- name: train
num_bytes: 42954830.0
num_examples: 800
- name: test
num_bytes: 1066370.0
num_examples: 20
download_size: 38269047
dataset_size: 44021200.0
---
# Dataset Card for "bw_spec_cls_4_04_s_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
carbon225/lichess-elite | ---
license: cc0-1.0
---
|
rishabh0000/empathetic_dialogues_mistral | ---
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 4899578
num_examples: 17843
download_size: 2755336
dataset_size: 4899578
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TinyPixel/s-data_3 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 20503965
num_examples: 34687
download_size: 9859072
dataset_size: 20503965
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "s-data_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anan-2024/twitter_dataset_1713080420 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 21023
num_examples: 47
download_size: 10700
dataset_size: 21023
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Weni/WeniGPT-QA-1.0.1 | ---
language:
- pt
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: chosen_response
dtype: string
- name: rejected_response
dtype: string
- name: correct_ans
dtype: int64
- name: flag_type
dtype: int64
splits:
- name: pt
num_bytes: 27689890
num_examples: 3180
download_size: 14905751
dataset_size: 27689890
configs:
- config_name: default
data_files:
- split: pt
path: data/pt-*
---
|
haitengzhao/molecule_property_instruction | ---
dataset_info:
features:
- name: graph
dtype: string
- name: text
sequence: string
- name: label
dtype: string
- name: dataset_name
dtype: string
- name: task_index
dtype: string
- name: molecule_index
dtype: string
- name: split
dtype: string
splits:
- name: esol
num_bytes: 542831
num_examples: 1128
- name: lipo
num_bytes: 1519836
num_examples: 4200
- name: freesolv
num_bytes: 527615
num_examples: 642
- name: bace
num_bytes: 5103112
num_examples: 1513
- name: hiv
num_bytes: 215094514
num_examples: 41127
- name: muv
num_bytes: 594798639
num_examples: 249886
- name: tox21
num_bytes: 121153396
num_examples: 77946
- name: toxcast
num_bytes: 1543462519
num_examples: 1490412
- name: bbbp
num_bytes: 2521597
num_examples: 2039
- name: cyp450
num_bytes: 30602477
num_examples: 53178
- name: chembl_zero_shot
num_bytes: 89499667
num_examples: 180229
- name: chembl_pretraining
num_bytes: 12246285194
num_examples: 23874346
- name: pcba
num_bytes: 21761726609
num_examples: 34017170
download_size: 2163300521
dataset_size: 36612838006
license: afl-3.0
task_categories:
- question-answering
language:
- en
tags:
- chemistry
- biology
pretty_name: p
---
# Dataset Card for "molecule_property_instruction"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Anwaarma/BP | ---
dataset_info:
features:
- name: Target
dtype: int64
- name: PC
dtype: string
- name: GSHARE
dtype: string
- name: GA table
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 162560000
num_examples: 320000
- name: test
num_bytes: 40640000
num_examples: 80000
download_size: 11801559
dataset_size: 203200000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Harshkmr/hindi_translated | ---
dataset_info:
features:
- name: system_prompt
dtype: string
- name: human_input
dtype: string
- name: Assistant_output
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 1688935
num_examples: 3158
download_size: 460424
dataset_size: 1688935
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/otosuna_mihari_mangakasantoassistantsanto | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Otosuna Mihari
This is the dataset of Otosuna Mihari, containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 459 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 459 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 459 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 459 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
lewtun/xrays | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1483315.0
num_examples: 15
download_size: 1483649
dataset_size: 1483315.0
---
# Dataset Card for "xrays"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FaalSa/data5 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 17309
num_examples: 1
- name: validation
num_bytes: 17789
num_examples: 1
- name: test
num_bytes: 18269
num_examples: 1
download_size: 13057
dataset_size: 53367
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
DynamicSuperbPrivate/SpeechDetection_LibrispeechTrainClean100 | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 6521891356.935
num_examples: 28539
- name: validation
num_bytes: 349517035.018
num_examples: 2703
download_size: 6769766359
dataset_size: 6871408391.953
---
# Dataset Card for "speechDetection_LibrispeechTrainClean100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
neoneye/histogram-comparisons-small-v1 | ---
license: mit
task_categories:
- image-to-text
language:
- en
size_categories:
- 100K<n<1M
---
This is a small subset of the huge [histogram-comparisons-v1](https://huggingface.co/datasets/neoneye/histogram-comparisons-v1) dataset with 3M rows.
This dataset contains 150,000 items in total: 3 curricula of 50,000 items each.
Each item is a markdown document.
Each item contains between 2 and 6 image comparisons, with a `Summary` at the bottom.
The images are between 3x3 and 14x14.
The markdown document contains a `## Response` heading that separates the prompt from the answer.
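Since the prompt is everything above the `## Response` heading and the answer is everything from that heading on, an item can be split with a few lines of Python (a minimal sketch; `split_item` is a hypothetical helper, not part of the dataset):

```python
# Split one markdown item into the prompt (text before "## Response")
# and the answer (the "## Response" section and everything after it).
def split_item(document: str) -> tuple[str, str]:
    marker = "\n## Response\n"
    prompt, _, rest = document.partition(marker)
    return prompt, "## Response\n" + rest

doc = (
    "# Histogram comparisons with summary\n"
    "## Data A\n### Data left\n### Data right\n"
    "## Response\n## Compare A\n## Summary\n"
)
prompt, answer = split_item(doc)
print(answer.startswith("## Response"))  # True
```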
The structure of a markdown document with 3 comparisons (A, B, C):
```
# Histogram comparisons with summary
## Data A
### Data left
### Data right
## Data B
### Data left
### Data right
## Data C
### Data left
### Data right
## Response
## Compare A
## Compare B
## Compare C
## Summary
``` |
byebyebye/ukr-wiki-qa-v2 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: topic
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 92863861
num_examples: 73597
download_size: 28849648
dataset_size: 92863861
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ppppssss/human | ---
license: afl-3.0
---
|
DynamicSuperb/NoiseDetection_LJSpeech_MUSAN-Gaussian | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 25740990.114503816
num_examples: 200
download_size: 25662939
dataset_size: 25740990.114503816
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "NoiseDetectiongaussian_LJSpeechMusan"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ThiagoBaptista/vozhenrique | ---
license: openrail
---
|
eduge | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- mn
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-class-classification
pretty_name: Eduge
dataset_info:
features:
- name: news
dtype: string
- name: label
dtype:
class_label:
names:
'0': урлаг соёл
'1': эдийн засаг
'2': эрүүл мэнд
'3': хууль
'4': улс төр
'5': спорт
'6': технологи
'7': боловсрол
'8': байгал орчин
splits:
- name: train
num_bytes: 255275842
num_examples: 60528
- name: test
num_bytes: 64451731
num_examples: 15133
download_size: 320395067
dataset_size: 319727573
---
# Dataset Card for Eduge
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://eduge.mn/
- **Repository:** https://github.com/tugstugi/mongolian-nlp
- **Paper:** [Needs More Information]
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Needs More Information]
### Dataset Summary
The Eduge news classification dataset is provided by Bolorsoft LLC and was used to train the Eduge.mn production news classifier. It contains 75K news articles in 9 categories: урлаг соёл, эдийн засаг, эрүүл мэнд, хууль, улс төр, спорт, технологи, боловсрол and байгал орчин.
### Supported Tasks and Leaderboards
- `text-classification`: the dataset can be used for a 9-class topic classification task.
### Languages
The text in the dataset is in Mongolian.
## Dataset Structure
### Data Instances
For the `default` configuration:
```
{
'label': 0, # 'урлаг соёл'
'news': 'Шударга өрсөлдөөн, хэрэглэгчийн төлөө газар 2013 оны дөрөвдүгээр сараас эхлэн Монгол киноны ашиг орлогын мэдээллийг олон нийтэд хүргэж байгаа. Ингэснээр Монголын кино үйлдвэрлэгчид улсад ашиг орлогоо шударгаар төлөх, мөн чанартай уран бүтээлийн тоо өсөх боломж бүрдэж байгаа юм.',
}
```
### Data Fields
- `news`: a complete news article on a specific topic as a string
- `label`: the single topic class, among these values: "урлаг соёл" (0), "эдийн засаг" (1), "эрүүл мэнд" (2), "хууль" (3), "улс төр" (4), "спорт" (5), "технологи" (6), "боловсрол" (7), "байгал орчин" (8).
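The integer labels can be mapped back to category names with a simple lookup; a minimal sketch (the list order follows the class indices given above, and the English glosses in the comments are our own):

```python
# Eduge class names in index order, as listed in the data fields above.
EDUGE_LABELS = [
    "урлаг соёл",    # 0: arts & culture
    "эдийн засаг",   # 1: economy
    "эрүүл мэнд",    # 2: health
    "хууль",         # 3: law
    "улс төр",       # 4: politics
    "спорт",         # 5: sports
    "технологи",     # 6: technology
    "боловсрол",     # 7: education
    "байгал орчин",  # 8: environment
]

def label_name(label_id: int) -> str:
    """Return the category name for an integer class label."""
    return EDUGE_LABELS[label_id]

print(label_name(0))  # урлаг соёл
```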
### Data Splits
The set of complete articles is split into a training and test set.
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
Eduge.mn, which aggregates articles from shuud.mn, ikon.mn, olloo.mn, news.gogo.mn, montsame.mn, zaluu.com, sonin.mn, medee.mn and bloombergtv.mn.
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
No citation available for this dataset.
### Contributions
Thanks to [@enod](https://github.com/enod) for adding this dataset. |
tasksource/implicatures | ---
license: gpl
---
Implicature corpus
```bib
@article{george2020conversational,
title={Conversational implicatures in English dialogue: Annotated dataset},
author={George, Elizabeth Jasmi and Mamidi, Radhika},
journal={Procedia Computer Science},
volume={171},
pages={2316--2323},
year={2020},
publisher={Elsevier}
}
```
Augmented with generated distractors (notebook: https://colab.research.google.com/drive/1ix0FgwzPAjQkIQA2E3ctlylvcmya7vGy?usp=sharing) for tasksource:
```bib
@article{sileo2023tasksource,
title={tasksource: Structured Dataset Preprocessing Annotations for Frictionless Extreme Multi-Task Learning and Evaluation},
author={Sileo, Damien},
url= {https://arxiv.org/abs/2301.05948},
journal={arXiv preprint arXiv:2301.05948},
year={2023}
}
``` |
huggingartists/tony-raut-and-garry-topor | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/tony-raut-and-garry-topor"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.083901 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/7249d6785a5c87095850bd4048595e08.989x989x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/tony-raut-and-garry-topor">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Тони Раут (Tony Raut) & Гарри Топор (Garry Topor)</div>
<a href="https://genius.com/artists/tony-raut-and-garry-topor">
<div style="text-align: center; font-size: 14px;">@tony-raut-and-garry-topor</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/tony-raut-and-garry-topor).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/tony-raut-and-garry-topor")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|15| -| -|
The `train` split can easily be divided into `train`, `validation` and `test` splits with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/tony-raut-and-garry-topor")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author={Aleksey Korshuk},
    year={2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_107 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 895226080.0
num_examples: 174440
download_size: 916259653
dataset_size: 895226080.0
---
# Dataset Card for "chunk_107"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713086186 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 27583
num_examples: 70
download_size: 18086
dataset_size: 27583
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713086186"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
presencesw/cot-collection_v3 | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: rationale
dtype: string
- name: task
dtype: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 797109586.2642245
num_examples: 1088577
download_size: 410906083
dataset_size: 797109586.2642245
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/musujime_awaki_toarumajutsunoindex | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Musujime Awaki
This is the dataset of Musujime Awaki, containing 158 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 158 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 328 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 158 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 158 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 158 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 158 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 158 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 328 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 328 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 328 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_ehartford__CodeLlama-34b-Instruct-hf | ---
pretty_name: Evaluation run of ehartford/CodeLlama-34b-Instruct-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/CodeLlama-34b-Instruct-hf](https://huggingface.co/ehartford/CodeLlama-34b-Instruct-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__CodeLlama-34b-Instruct-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-26T00:11:17.332215](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__CodeLlama-34b-Instruct-hf/blob/main/results_2023-08-26T00%3A11%3A17.332215.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3954825543560614,\n\
\ \"acc_stderr\": 0.034996131407759465,\n \"acc_norm\": 0.39693969001192136,\n\
\ \"acc_norm_stderr\": 0.03500279971831286,\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4428923144531004,\n\
\ \"mc2_stderr\": 0.014810370517699043\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.378839590443686,\n \"acc_stderr\": 0.01417591549000032,\n\
\ \"acc_norm\": 0.40784982935153585,\n \"acc_norm_stderr\": 0.014361097288449708\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2998406691894045,\n\
\ \"acc_stderr\": 0.004572515919210699,\n \"acc_norm\": 0.35680143397729536,\n\
\ \"acc_norm_stderr\": 0.004780764443411313\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n\
\ \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.35555555555555557,\n\
\ \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4037735849056604,\n \"acc_stderr\": 0.030197611600197953,\n\
\ \"acc_norm\": 0.4037735849056604,\n \"acc_norm_stderr\": 0.030197611600197953\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n\
\ \"acc_stderr\": 0.04032999053960718,\n \"acc_norm\": 0.3680555555555556,\n\
\ \"acc_norm_stderr\": 0.04032999053960718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3583815028901734,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.3583815028901734,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3872340425531915,\n \"acc_stderr\": 0.03184389265339525,\n\
\ \"acc_norm\": 0.3872340425531915,\n \"acc_norm_stderr\": 0.03184389265339525\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843672,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843672\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03960933549451208,\n\
\ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03960933549451208\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.02369541500946309,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.02369541500946309\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.44193548387096776,\n \"acc_stderr\": 0.02825155790684974,\n \"\
acc_norm\": 0.44193548387096776,\n \"acc_norm_stderr\": 0.02825155790684974\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n \"\
acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.035243908445117836,\n\
\ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.035243908445117836\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5202020202020202,\n \"acc_stderr\": 0.03559443565563918,\n \"\
acc_norm\": 0.5202020202020202,\n \"acc_norm_stderr\": 0.03559443565563918\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.533678756476684,\n \"acc_stderr\": 0.036002440698671784,\n\
\ \"acc_norm\": 0.533678756476684,\n \"acc_norm_stderr\": 0.036002440698671784\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3564102564102564,\n \"acc_stderr\": 0.024283140529467295,\n\
\ \"acc_norm\": 0.3564102564102564,\n \"acc_norm_stderr\": 0.024283140529467295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.39915966386554624,\n \"acc_stderr\": 0.031811100324139245,\n\
\ \"acc_norm\": 0.39915966386554624,\n \"acc_norm_stderr\": 0.031811100324139245\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.45321100917431195,\n \"acc_stderr\": 0.021343255165546037,\n \"\
acc_norm\": 0.45321100917431195,\n \"acc_norm_stderr\": 0.021343255165546037\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.029157522184605596,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.029157522184605596\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.28431372549019607,\n \"acc_stderr\": 0.03166009679399812,\n \"\
acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.03166009679399812\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4008438818565401,\n \"acc_stderr\": 0.03190080389473236,\n \
\ \"acc_norm\": 0.4008438818565401,\n \"acc_norm_stderr\": 0.03190080389473236\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4260089686098655,\n\
\ \"acc_stderr\": 0.0331883328621728,\n \"acc_norm\": 0.4260089686098655,\n\
\ \"acc_norm_stderr\": 0.0331883328621728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4351145038167939,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.4351145038167939,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5702479338842975,\n \"acc_stderr\": 0.04519082021319773,\n \"\
acc_norm\": 0.5702479338842975,\n \"acc_norm_stderr\": 0.04519082021319773\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3803680981595092,\n \"acc_stderr\": 0.03814269893261837,\n\
\ \"acc_norm\": 0.3803680981595092,\n \"acc_norm_stderr\": 0.03814269893261837\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5436893203883495,\n \"acc_stderr\": 0.049318019942204146,\n\
\ \"acc_norm\": 0.5436893203883495,\n \"acc_norm_stderr\": 0.049318019942204146\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6196581196581197,\n\
\ \"acc_stderr\": 0.03180425204384099,\n \"acc_norm\": 0.6196581196581197,\n\
\ \"acc_norm_stderr\": 0.03180425204384099\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\
: {\n \"acc\": 0.565772669220945,\n \"acc_stderr\": 0.017724589389677785,\n\
\ \"acc_norm\": 0.565772669220945,\n \"acc_norm_stderr\": 0.017724589389677785\n\
\ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.41040462427745666,\n\
\ \"acc_stderr\": 0.02648339204209818,\n \"acc_norm\": 0.41040462427745666,\n\
\ \"acc_norm_stderr\": 0.02648339204209818\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\
: {\n \"acc\": 0.20446927374301677,\n \"acc_stderr\": 0.013488813404711917,\n\
\ \"acc_norm\": 0.20446927374301677,\n \"acc_norm_stderr\": 0.013488813404711917\n\
\ },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4117647058823529,\n\
\ \"acc_stderr\": 0.02818059632825929,\n \"acc_norm\": 0.4117647058823529,\n\
\ \"acc_norm_stderr\": 0.02818059632825929\n },\n \"harness|hendrycksTest-philosophy|5\"\
: {\n \"acc\": 0.5048231511254019,\n \"acc_stderr\": 0.028396770444111298,\n\
\ \"acc_norm\": 0.5048231511254019,\n \"acc_norm_stderr\": 0.028396770444111298\n\
\ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4567901234567901,\n\
\ \"acc_stderr\": 0.02771666165019404,\n \"acc_norm\": 0.4567901234567901,\n\
\ \"acc_norm_stderr\": 0.02771666165019404\n },\n \"harness|hendrycksTest-professional_accounting|5\"\
: {\n \"acc\": 0.31560283687943264,\n \"acc_stderr\": 0.027724989449509314,\n\
\ \"acc_norm\": 0.31560283687943264,\n \"acc_norm_stderr\": 0.027724989449509314\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27249022164276404,\n\
\ \"acc_stderr\": 0.011371658294311514,\n \"acc_norm\": 0.27249022164276404,\n\
\ \"acc_norm_stderr\": 0.011371658294311514\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.33088235294117646,\n \"acc_stderr\": 0.02858270975389844,\n\
\ \"acc_norm\": 0.33088235294117646,\n \"acc_norm_stderr\": 0.02858270975389844\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3431372549019608,\n \"acc_stderr\": 0.019206606848825365,\n \
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.019206606848825365\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2897959183673469,\n \"acc_stderr\": 0.02904308868330432,\n\
\ \"acc_norm\": 0.2897959183673469,\n \"acc_norm_stderr\": 0.02904308868330432\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.48258706467661694,\n\
\ \"acc_stderr\": 0.03533389234739244,\n \"acc_norm\": 0.48258706467661694,\n\
\ \"acc_norm_stderr\": 0.03533389234739244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.03819486140758398,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.03819486140758398\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6432748538011696,\n \"acc_stderr\": 0.03674013002860954,\n\
\ \"acc_norm\": 0.6432748538011696,\n \"acc_norm_stderr\": 0.03674013002860954\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4428923144531004,\n\
\ \"mc2_stderr\": 0.014810370517699043\n }\n}\n```"
repo_url: https://huggingface.co/ehartford/CodeLlama-34b-Instruct-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|arc:challenge|25_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hellaswag|10_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T00:11:17.332215.parquet'
- config_name: results
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- results_2023-08-26T00:11:17.332215.parquet
- split: latest
path:
- results_2023-08-26T00:11:17.332215.parquet
---
# Dataset Card for Evaluation run of ehartford/CodeLlama-34b-Instruct-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/CodeLlama-34b-Instruct-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/CodeLlama-34b-Instruct-hf](https://huggingface.co/ehartford/CodeLlama-34b-Instruct-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__CodeLlama-34b-Instruct-hf",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-26T00:11:17.332215](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__CodeLlama-34b-Instruct-hf/blob/main/results_2023-08-26T00%3A11%3A17.332215.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3954825543560614,
"acc_stderr": 0.034996131407759465,
"acc_norm": 0.39693969001192136,
"acc_norm_stderr": 0.03500279971831286,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.4428923144531004,
"mc2_stderr": 0.014810370517699043
},
"harness|arc:challenge|25": {
"acc": 0.378839590443686,
"acc_stderr": 0.01417591549000032,
"acc_norm": 0.40784982935153585,
"acc_norm_stderr": 0.014361097288449708
},
"harness|hellaswag|10": {
"acc": 0.2998406691894045,
"acc_stderr": 0.004572515919210699,
"acc_norm": 0.35680143397729536,
"acc_norm_stderr": 0.004780764443411313
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4037735849056604,
"acc_stderr": 0.030197611600197953,
"acc_norm": 0.4037735849056604,
"acc_norm_stderr": 0.030197611600197953
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3680555555555556,
"acc_stderr": 0.04032999053960718,
"acc_norm": 0.3680555555555556,
"acc_norm_stderr": 0.04032999053960718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3583815028901734,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.3583815028901734,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3872340425531915,
"acc_stderr": 0.03184389265339525,
"acc_norm": 0.3872340425531915,
"acc_norm_stderr": 0.03184389265339525
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843672,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843672
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03960933549451208,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03960933549451208
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.02369541500946309,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.02369541500946309
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.44193548387096776,
"acc_stderr": 0.02825155790684974,
"acc_norm": 0.44193548387096776,
"acc_norm_stderr": 0.02825155790684974
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.035243908445117836,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.035243908445117836
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5202020202020202,
"acc_stderr": 0.03559443565563918,
"acc_norm": 0.5202020202020202,
"acc_norm_stderr": 0.03559443565563918
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.533678756476684,
"acc_stderr": 0.036002440698671784,
"acc_norm": 0.533678756476684,
"acc_norm_stderr": 0.036002440698671784
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3564102564102564,
"acc_stderr": 0.024283140529467295,
"acc_norm": 0.3564102564102564,
"acc_norm_stderr": 0.024283140529467295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.39915966386554624,
"acc_stderr": 0.031811100324139245,
"acc_norm": 0.39915966386554624,
"acc_norm_stderr": 0.031811100324139245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.45321100917431195,
"acc_stderr": 0.021343255165546037,
"acc_norm": 0.45321100917431195,
"acc_norm_stderr": 0.021343255165546037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.029157522184605596,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.029157522184605596
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.03166009679399812,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.03166009679399812
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4008438818565401,
"acc_stderr": 0.03190080389473236,
"acc_norm": 0.4008438818565401,
"acc_norm_stderr": 0.03190080389473236
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4260089686098655,
"acc_stderr": 0.0331883328621728,
"acc_norm": 0.4260089686098655,
"acc_norm_stderr": 0.0331883328621728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4351145038167939,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.4351145038167939,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5702479338842975,
"acc_stderr": 0.04519082021319773,
"acc_norm": 0.5702479338842975,
"acc_norm_stderr": 0.04519082021319773
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3803680981595092,
"acc_stderr": 0.03814269893261837,
"acc_norm": 0.3803680981595092,
"acc_norm_stderr": 0.03814269893261837
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.5436893203883495,
"acc_stderr": 0.049318019942204146,
"acc_norm": 0.5436893203883495,
"acc_norm_stderr": 0.049318019942204146
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6196581196581197,
"acc_stderr": 0.03180425204384099,
"acc_norm": 0.6196581196581197,
"acc_norm_stderr": 0.03180425204384099
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.565772669220945,
"acc_stderr": 0.017724589389677785,
"acc_norm": 0.565772669220945,
"acc_norm_stderr": 0.017724589389677785
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.02648339204209818,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.02648339204209818
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.20446927374301677,
"acc_stderr": 0.013488813404711917,
"acc_norm": 0.20446927374301677,
"acc_norm_stderr": 0.013488813404711917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.02818059632825929,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.02818059632825929
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5048231511254019,
"acc_stderr": 0.028396770444111298,
"acc_norm": 0.5048231511254019,
"acc_norm_stderr": 0.028396770444111298
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4567901234567901,
"acc_stderr": 0.02771666165019404,
"acc_norm": 0.4567901234567901,
"acc_norm_stderr": 0.02771666165019404
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.31560283687943264,
"acc_stderr": 0.027724989449509314,
"acc_norm": 0.31560283687943264,
"acc_norm_stderr": 0.027724989449509314
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27249022164276404,
"acc_stderr": 0.011371658294311514,
"acc_norm": 0.27249022164276404,
"acc_norm_stderr": 0.011371658294311514
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.33088235294117646,
"acc_stderr": 0.02858270975389844,
"acc_norm": 0.33088235294117646,
"acc_norm_stderr": 0.02858270975389844
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.019206606848825365,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.019206606848825365
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2897959183673469,
"acc_stderr": 0.02904308868330432,
"acc_norm": 0.2897959183673469,
"acc_norm_stderr": 0.02904308868330432
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.48258706467661694,
"acc_stderr": 0.03533389234739244,
"acc_norm": 0.48258706467661694,
"acc_norm_stderr": 0.03533389234739244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.03819486140758398,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.03819486140758398
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6432748538011696,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.6432748538011696,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.4428923144531004,
"mc2_stderr": 0.014810370517699043
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AbeShinzo0708/SugaYoshihide_voice_data | ---
license: other
language:
- ja
tags:
- Suga
- SugaYoshihide
- FormerJapanesePrimeMinister
- 菅義偉
--- |
jodchen/medical_dataset | ---
dataset_info:
features:
- name: chat_sample
dtype: string
- name: dataset_origin
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 6253382
num_examples: 5000
download_size: 2794124
dataset_size: 6253382
---
# Dataset Card for "medical_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
artyomboyko/Common_voice_15_0_ru_dataset_prepared_for_whisper_fine_tune | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 177453206896
num_examples: 184745
- name: test
num_bytes: 9793557768
num_examples: 10196
download_size: 33951963493
dataset_size: 187246764664
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
sethapun/arithmetic_2as_1to1000 | ---
dataset_info:
features:
- name: expression
dtype: string
- name: answer
dtype: int64
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 61582
num_examples: 2000
- name: validation
num_bytes: 12344
num_examples: 400
download_size: 28386
dataset_size: 73926
---
# Dataset Card for "arithmetic_2as_1to1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/Hatefulmemes_test_google_flan_t5_xxl_mode_T_A_D_PNP_FILTER_C_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 2880491
num_examples: 1000
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random_text
num_bytes: 9790861
num_examples: 1000
download_size: 2205019
dataset_size: 12671352
---
# Dataset Card for "Hatefulmemes_test_google_flan_t5_xxl_mode_T_A_D_PNP_FILTER_C_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Bingsu/KcBERT_Pre-Training_Corpus | ---
annotations_creators:
- no-annotation
language_creators:
- crowdsourced
language:
- ko
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
pretty_name: KcBERT Pre-Training Corpus (Korean News Comments)
size_categories:
- 10M<n<100M
source_datasets:
- original
task_categories:
- fill-mask
- text-generation
task_ids:
- masked-language-modeling
- language-modeling
---
# KcBERT Pre-Training Corpus (Korean News Comments)
## Dataset Description
- **Homepage:** [KcBERT Pre-Training Corpus](https://www.kaggle.com/datasets/junbumlee/kcbert-pretraining-corpus-korean-news-comments)
- **Repository:** [Beomi/KcBERT](https://github.com/Beomi/KcBERT)
- **Paper:** [Needs More Information]
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Needs More Information]
## KcBERT
[beomi/kcbert-base](https://huggingface.co/beomi/kcbert-base)
Github KcBERT Repo: [https://github.com/Beomi/KcBERT](https://github.com/Beomi/KcBERT)
KcBERT is a Korean Comments BERT model pretrained on this corpus.
(You can use it via Huggingface's Transformers library!)
This Kaggle dataset contains the **CLEANED** corpus, preprocessed with the code below.
```python
import re
import emoji
from soynlp.normalizer import repeat_normalize
emojis = ''.join(emoji.UNICODE_EMOJI.keys())
pattern = re.compile(f'[^ .,?!/@$%~%·∼()\x00-\x7Fㄱ-힣{emojis}]+')
url_pattern = re.compile(
r'https?:\/\/(www\.)?[-a-zA-Z0-9@:%._\+~#=]{1,256}\.[a-zA-Z0-9()]{1,6}\b([-a-zA-Z0-9()@:%_\+.~#?&//=]*)')
def clean(x):
x = pattern.sub(' ', x)
x = url_pattern.sub('', x)
x = x.strip()
x = repeat_normalize(x, num_repeats=2)
return x
```
### License
[CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)
## Dataset Structure
### Data Instance
```pycon
>>> from datasets import load_dataset
>>> dataset = load_dataset("Bingsu/KcBERT_Pre-Training_Corpus")
>>> dataset
DatasetDict({
train: Dataset({
features: ['text'],
num_rows: 86246285
})
})
```
### Data Size
download: 7.90 GiB<br>
generated: 11.86 GiB<br>
total: 19.76 GiB
※ You can download this dataset from [kaggle](https://www.kaggle.com/datasets/junbumlee/kcbert-pretraining-corpus-korean-news-comments), where it is 5 GiB compressed (12.48 GiB when uncompressed).
### Data Fields
- text: `string`
### Data Splits
| | train |
| ---------- | -------- |
| # of texts | 86246285 |
|
cmu-mlsp/encodec_24khz-opt-125m-pretrained-ft-librispeech_asr | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 24000
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
- name: id
dtype: string
- name: audio_codes
sequence:
sequence: int64
splits:
- name: train
num_bytes: 17829358082.086
num_examples: 28539
- name: validation
num_bytes: 955281891.125
num_examples: 2703
- name: test
num_bytes: 958024726.5
num_examples: 2620
download_size: 18905275151
dataset_size: 19742664699.711
---
# Dataset Card for "encodec_24khz-opt-125m-pretrained-ft-librispeech_asr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-WillHeld__stereoset_zero-WillHeld__stereoset_zero-7a6673-2074067132 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- WillHeld/stereoset_zero
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-7b1
metrics: []
dataset_name: WillHeld/stereoset_zero
dataset_config: WillHeld--stereoset_zero
dataset_split: train
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-7b1
* Dataset: WillHeld/stereoset_zero
* Config: WillHeld--stereoset_zero
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@WillHeld](https://huggingface.co/WillHeld) for evaluating this model. |
poorguys/chinese_fonts_basic_128x128 | ---
dataset_info:
features:
- name: image
dtype: image
- name: char
dtype: string
- name: unicode
dtype: string
- name: font
dtype: string
- name: font_type
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2677394.0
num_examples: 973
download_size: 0
dataset_size: 2677394.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "chinese_fonts_basic_128x128"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_200 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 910103748.0
num_examples: 177339
download_size: 931892217
dataset_size: 910103748.0
---
# Dataset Card for "chunk_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adityarra07/aug_train_8 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 361450011.1
num_examples: 2700
- name: test
num_bytes: 40092088.0
num_examples: 300
download_size: 395989646
dataset_size: 401542099.1
---
# Dataset Card for "aug_train8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
erickrribeiro/gender-by-name | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: Name
dtype: string
- name: Gender
dtype:
class_label:
names:
'0': F
'1': M
- name: Count
dtype: int64
- name: Probability
dtype: float64
splits:
- name: train
num_bytes: 4090843.4554794286
num_examples: 117815
- name: test
num_bytes: 1022719.5445205712
num_examples: 29454
download_size: 2497614
dataset_size: 5113563
license: cc-by-4.0
task_categories:
- text-classification
language:
- en
- pt
tags:
- gender_by_name
- social_science
- uci
pretty_name: Gender by Name
size_categories:
- 100K<n<1M
---
# Dataset Card for "Gender-by-Name"
This dataset attributes first names to genders, giving counts and probabilities. It combines open-source government data from the US, UK, Canada, and Australia. The dataset is taken from the [UCI Machine Learning Repository](https://archive.ics.uci.edu/dataset/591/gender+by+name).
## Dataset Information
This dataset combines raw counts for first/given names of male and female babies born in the covered time periods, then calculates a probability for each name from the aggregate counts. Source datasets are from government authorities:
- US: Baby Names from Social Security Card Applications - National Data, 1880 to 2019
- UK: Baby Names in England and Wales Statistical Bulletins, 2011 to 2018
- Canada: British Columbia 100 Years of Popular Baby Names, 1918 to 2018
- Australia: Popular Baby Names, Attorney-General's Department, 1944 to 2019
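The probability computation described above can be sketched as follows. This is a minimal illustration, not the dataset's actual pipeline: the names and counts are made up, and it assumes each probability is a name's count divided by the total count across all names.

```python
# Hypothetical aggregate counts (illustrative values, not taken from the data).
counts = {"James": 5_304_407, "Mary": 4_115_282, "Robert": 4_820_129}

# Probability of each name relative to the aggregate total.
total = sum(counts.values())
probability = {name: c / total for name, c in counts.items()}
```

Under this reading, the per-name probabilities sum to 1 over the sample.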
## Has Missing Values?
No
## Variable Information
- Name: string
- Gender: 0/1 (female/male)
- Count: integer
- Probability: float
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joell/project1 | ---
license: mit
---
|
cakiki/julia_paths | ---
dataset_info:
features:
- name: repository_name
dtype: string
splits:
- name: train
num_bytes: 14862518
num_examples: 473425
download_size: 7932474
dataset_size: 14862518
---
# Dataset Card for "julia_paths"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tmnam20/Vietnamese-Books-dedup | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3546619845
num_examples: 14485736
download_size: 1922215933
dataset_size: 3546619845
---
# Dataset Card for "Vietnamese-Books-dedup"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
coref-data/winogrande_coref | ---
license: cc-by-4.0
---
# WinoGrande Recast as Coreference Resolution
### Dataset Summary
The WinoGrande train and development sets, recast as coreference resolution as described in [Investigating Failures to Generalize for Coreference Resolution Models](https://arxiv.org/abs/2303.09092). CoNLL-U columns are parsed using Stanza.
### Data Fields
```python
{
"id": str, # example id
"text": str, # untokenized example text
"sentences": [
{
"id": int, # sentence index
"text": str, # untokenized sentence text
"speaker": None, # speaker
"tokens": [
{
# keys are conllu columns: id, text, lemma, upos, xpos, feats, head, deprel, deps, misc
},
...
]
},
...
],
"coref_chains": List[List[List[int]]], # list of clusters, each cluster is a list of mentions, each mention is a span represented as [sent, start, end] inclusive
"genre": "crowdsourced",
"meta_data": {
"comment": "syntax_annotations=stanza|tokenizer=stanza|detokenizer=nltk",
},
}
```
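Given the schema above, a mention span `[sent, start, end]` can be resolved back to token text as in the sketch below. The example record and the 0-based token indexing are assumptions for illustration, not real data from this dataset.

```python
# Toy record mirroring the schema above (hypothetical, not from the dataset).
example = {
    "sentences": [{
        "id": 0,
        "text": "The trophy didn't fit in the suitcase because it was too big.",
        "tokens": [{"text": w} for w in
                   "The trophy didn't fit in the suitcase because it was too big .".split()],
    }],
    # One cluster linking "The trophy" and "it".
    "coref_chains": [[[0, 0, 1], [0, 8, 8]]],
}

def mention_text(example, mention):
    """Join the tokens covered by a [sent, start, end] span (end inclusive)."""
    sent_idx, start, end = mention
    tokens = example["sentences"][sent_idx]["tokens"]
    return " ".join(t["text"] for t in tokens[start:end + 1])

for cluster in example["coref_chains"]:
    print([mention_text(example, m) for m in cluster])  # -> ['The trophy', 'it']
```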
### Citation Information
```
@misc{porada2023investigating,
title={Investigating Failures to Generalize for Coreference Resolution Models},
author={Ian Porada and Alexandra Olteanu and Kaheer Suleman and Adam Trischler and Jackie Chi Kit Cheung},
year={2023},
eprint={2303.09092},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@InProceedings{ai2:winogrande,
  title  = {WinoGrande: An Adversarial Winograd Schema Challenge at Scale},
  author = {Sakaguchi, Keisuke and Le Bras, Ronan and Bhagavatula, Chandra and Choi, Yejin},
  year   = {2019}
}
```
|
Sigurdur/jonas-hallgrimsson-data | ---
task_categories:
- text-classification
- text-generation
- question-answering
language:
- is
size_categories:
- 1M<n<10M
---
# All of Jónas Hallgrímsson's poems in one place
The data was taken and processed from https://www.snerpa.is/net/kvaedi/jonas.htm. |