| datasetId | card |
|---|---|
arlanaskar/NUFYP_Progression_Grades | ---
license: mit
---
|
open-llm-leaderboard/details_gardner__TinyLlama-1.1B-SlimOrca-Function-Calling-3T | ---
pretty_name: Evaluation run of gardner/TinyLlama-1.1B-SlimOrca-Function-Calling-3T
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gardner/TinyLlama-1.1B-SlimOrca-Function-Calling-3T](https://huggingface.co/gardner/TinyLlama-1.1B-SlimOrca-Function-Calling-3T)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gardner__TinyLlama-1.1B-SlimOrca-Function-Calling-3T\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-27T06:41:42.022481](https://huggingface.co/datasets/open-llm-leaderboard/details_gardner__TinyLlama-1.1B-SlimOrca-Function-Calling-3T/blob/main/results_2024-01-27T06-41-42.022481.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2868856338605546,\n\
\ \"acc_stderr\": 0.03190338525939913,\n \"acc_norm\": 0.2887423811940406,\n\
\ \"acc_norm_stderr\": 0.03265770846944957,\n \"mc1\": 0.23011015911872704,\n\
\ \"mc1_stderr\": 0.014734557959807762,\n \"mc2\": 0.3674239002696778,\n\
\ \"mc2_stderr\": 0.014479746743393794\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3310580204778157,\n \"acc_stderr\": 0.013752062419817834,\n\
\ \"acc_norm\": 0.3609215017064846,\n \"acc_norm_stderr\": 0.01403476138617546\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4547898824935272,\n\
\ \"acc_stderr\": 0.004969341773423514,\n \"acc_norm\": 0.5965943039235212,\n\
\ \"acc_norm_stderr\": 0.0048957821077864885\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.04256193767901407,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.04256193767901407\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.034597776068105365,\n\
\ \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.034597776068105365\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483099,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483099\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.1568627450980392,\n \"acc_stderr\": 0.03618664819936248,\n\
\ \"acc_norm\": 0.1568627450980392,\n \"acc_norm_stderr\": 0.03618664819936248\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33191489361702126,\n \"acc_stderr\": 0.030783736757745647,\n\
\ \"acc_norm\": 0.33191489361702126,\n \"acc_norm_stderr\": 0.030783736757745647\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.037649508797906066,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.037649508797906066\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27741935483870966,\n\
\ \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.27741935483870966,\n\
\ \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.31313131313131315,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.31313131313131315,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.32124352331606215,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.32124352331606215,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24871794871794872,\n \"acc_stderr\": 0.021916957709213803,\n\
\ \"acc_norm\": 0.24871794871794872,\n \"acc_norm_stderr\": 0.021916957709213803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844086,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844086\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25630252100840334,\n \"acc_stderr\": 0.02835962087053395,\n\
\ \"acc_norm\": 0.25630252100840334,\n \"acc_norm_stderr\": 0.02835962087053395\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.29541284403669726,\n \"acc_stderr\": 0.019560619182975997,\n \"\
acc_norm\": 0.29541284403669726,\n \"acc_norm_stderr\": 0.019560619182975997\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.029531221160930918,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.029531221160930918\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.2696078431372549,\n \"acc_stderr\": 0.031145570659486782,\n\
\ \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2869198312236287,\n \"acc_stderr\": 0.02944377302259469,\n \
\ \"acc_norm\": 0.2869198312236287,\n \"acc_norm_stderr\": 0.02944377302259469\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3721973094170404,\n\
\ \"acc_stderr\": 0.03244305283008732,\n \"acc_norm\": 0.3721973094170404,\n\
\ \"acc_norm_stderr\": 0.03244305283008732\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.03880848301082396,\n\
\ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.03880848301082396\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.37037037037037035,\n\
\ \"acc_stderr\": 0.04668408033024932,\n \"acc_norm\": 0.37037037037037035,\n\
\ \"acc_norm_stderr\": 0.04668408033024932\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.03623089915724148,\n\
\ \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.03623089915724148\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.043642261558410445,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.043642261558410445\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.04582124160161549,\n\
\ \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.04582124160161549\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3418803418803419,\n\
\ \"acc_stderr\": 0.031075028526507745,\n \"acc_norm\": 0.3418803418803419,\n\
\ \"acc_norm_stderr\": 0.031075028526507745\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3397190293742018,\n\
\ \"acc_stderr\": 0.016936394114301655,\n \"acc_norm\": 0.3397190293742018,\n\
\ \"acc_norm_stderr\": 0.016936394114301655\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.30057803468208094,\n \"acc_stderr\": 0.024685316867257792,\n\
\ \"acc_norm\": 0.30057803468208094,\n \"acc_norm_stderr\": 0.024685316867257792\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217892,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217892\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2875816993464052,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2958199356913183,\n\
\ \"acc_stderr\": 0.025922371788818798,\n \"acc_norm\": 0.2958199356913183,\n\
\ \"acc_norm_stderr\": 0.025922371788818798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3271604938271605,\n \"acc_stderr\": 0.026105673861409828,\n\
\ \"acc_norm\": 0.3271604938271605,\n \"acc_norm_stderr\": 0.026105673861409828\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2907801418439716,\n \"acc_stderr\": 0.027090664368353178,\n \
\ \"acc_norm\": 0.2907801418439716,\n \"acc_norm_stderr\": 0.027090664368353178\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2633637548891786,\n\
\ \"acc_stderr\": 0.011249506403605279,\n \"acc_norm\": 0.2633637548891786,\n\
\ \"acc_norm_stderr\": 0.011249506403605279\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3713235294117647,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.3713235294117647,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.38181818181818183,\n \"acc_stderr\": 0.046534298079135075,\n\
\ \"acc_norm\": 0.38181818181818183,\n \"acc_norm_stderr\": 0.046534298079135075\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.025000256039546205,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.025000256039546205\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.2835820895522388,\n \"acc_stderr\": 0.03187187537919798,\n\
\ \"acc_norm\": 0.2835820895522388,\n \"acc_norm_stderr\": 0.03187187537919798\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.2289156626506024,\n \"acc_stderr\": 0.03270745277352477,\n\
\ \"acc_norm\": 0.2289156626506024,\n \"acc_norm_stderr\": 0.03270745277352477\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.32748538011695905,\n\
\ \"acc_stderr\": 0.03599335771456027,\n \"acc_norm\": 0.32748538011695905,\n\
\ \"acc_norm_stderr\": 0.03599335771456027\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.23011015911872704,\n \"mc1_stderr\": 0.014734557959807762,\n\
\ \"mc2\": 0.3674239002696778,\n \"mc2_stderr\": 0.014479746743393794\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.5911602209944752,\n\
\ \"acc_stderr\": 0.013816954295135683\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.04473085670962851,\n \"acc_stderr\": 0.005693886131407052\n\
\ }\n}\n```"
repo_url: https://huggingface.co/gardner/TinyLlama-1.1B-SlimOrca-Function-Calling-3T
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|arc:challenge|25_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|gsm8k|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hellaswag|10_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T06-41-42.022481.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T06-41-42.022481.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- '**/details_harness|winogrande|5_2024-01-27T06-41-42.022481.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-27T06-41-42.022481.parquet'
- config_name: results
data_files:
- split: 2024_01_27T06_41_42.022481
path:
- results_2024-01-27T06-41-42.022481.parquet
- split: latest
path:
- results_2024-01-27T06-41-42.022481.parquet
---
# Dataset Card for Evaluation run of gardner/TinyLlama-1.1B-SlimOrca-Function-Calling-3T
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gardner/TinyLlama-1.1B-SlimOrca-Function-Calling-3T](https://huggingface.co/gardner/TinyLlama-1.1B-SlimOrca-Function-Calling-3T) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gardner__TinyLlama-1.1B-SlimOrca-Function-Calling-3T",
"harness_winogrande_5",
split="train")
```
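Each evaluated task has its own configuration whose name is derived from the harness task name (dashes and colons become underscores, and the few-shot count is appended), and every configuration also exposes a `latest` split alongside the timestamped one. A minimal sketch of this mapping (the `datasets` import is done lazily so the naming helper works on its own):

```python
def config_name(task: str, n_shot: int) -> str:
    """Map a harness task name to its dataset configuration name."""
    return f"harness_{task.replace('-', '_').replace(':', '_')}_{n_shot}"

def latest_details(task: str, n_shot: int):
    """Load the latest evaluation details for one harness task."""
    from datasets import load_dataset  # lazy import: only needed when loading
    return load_dataset(
        "open-llm-leaderboard/details_gardner__TinyLlama-1.1B-SlimOrca-Function-Calling-3T",
        config_name(task, n_shot),
        split="latest",
    )

print(config_name("hendrycksTest-computer_security", 5))
# harness_hendrycksTest_computer_security_5
```

For example, `latest_details("truthfulqa:mc", 0)` resolves to the `harness_truthfulqa_mc_0` configuration listed above.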
## Latest results
These are the [latest results from run 2024-01-27T06:41:42.022481](https://huggingface.co/datasets/open-llm-leaderboard/details_gardner__TinyLlama-1.1B-SlimOrca-Function-Calling-3T/blob/main/results_2024-01-27T06-41-42.022481.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2868856338605546,
"acc_stderr": 0.03190338525939913,
"acc_norm": 0.2887423811940406,
"acc_norm_stderr": 0.03265770846944957,
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807762,
"mc2": 0.3674239002696778,
"mc2_stderr": 0.014479746743393794
},
"harness|arc:challenge|25": {
"acc": 0.3310580204778157,
"acc_stderr": 0.013752062419817834,
"acc_norm": 0.3609215017064846,
"acc_norm_stderr": 0.01403476138617546
},
"harness|hellaswag|10": {
"acc": 0.4547898824935272,
"acc_stderr": 0.004969341773423514,
"acc_norm": 0.5965943039235212,
"acc_norm_stderr": 0.0048957821077864885
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.04256193767901407,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.04256193767901407
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.034597776068105365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.034597776068105365
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2792452830188679,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.2792452830188679,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483099,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483099
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.1568627450980392,
"acc_stderr": 0.03618664819936248,
"acc_norm": 0.1568627450980392,
"acc_norm_stderr": 0.03618664819936248
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33191489361702126,
"acc_stderr": 0.030783736757745647,
"acc_norm": 0.33191489361702126,
"acc_norm_stderr": 0.030783736757745647
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481425,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481425
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.037649508797906066,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.037649508797906066
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.27741935483870966,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.27741935483870966,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694436,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694436
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.31313131313131315,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.31313131313131315,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.32124352331606215,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.32124352331606215,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24871794871794872,
"acc_stderr": 0.021916957709213803,
"acc_norm": 0.24871794871794872,
"acc_norm_stderr": 0.021916957709213803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844086,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844086
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25630252100840334,
"acc_stderr": 0.02835962087053395,
"acc_norm": 0.25630252100840334,
"acc_norm_stderr": 0.02835962087053395
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.29541284403669726,
"acc_stderr": 0.019560619182975997,
"acc_norm": 0.29541284403669726,
"acc_norm_stderr": 0.019560619182975997
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25,
"acc_stderr": 0.029531221160930918,
"acc_norm": 0.25,
"acc_norm_stderr": 0.029531221160930918
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2869198312236287,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.2869198312236287,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3721973094170404,
"acc_stderr": 0.03244305283008732,
"acc_norm": 0.3721973094170404,
"acc_norm_stderr": 0.03244305283008732
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.03880848301082396,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.03880848301082396
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.04668408033024932,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.04668408033024932
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.03623089915724148,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.03623089915724148
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.043642261558410445,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.043642261558410445
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.04582124160161549,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.04582124160161549
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3418803418803419,
"acc_stderr": 0.031075028526507745,
"acc_norm": 0.3418803418803419,
"acc_norm_stderr": 0.031075028526507745
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3397190293742018,
"acc_stderr": 0.016936394114301655,
"acc_norm": 0.3397190293742018,
"acc_norm_stderr": 0.016936394114301655
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.30057803468208094,
"acc_stderr": 0.024685316867257792,
"acc_norm": 0.30057803468208094,
"acc_norm_stderr": 0.024685316867257792
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217892,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217892
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.2875816993464052,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2958199356913183,
"acc_stderr": 0.025922371788818798,
"acc_norm": 0.2958199356913183,
"acc_norm_stderr": 0.025922371788818798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3271604938271605,
"acc_stderr": 0.026105673861409828,
"acc_norm": 0.3271604938271605,
"acc_norm_stderr": 0.026105673861409828
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2907801418439716,
"acc_stderr": 0.027090664368353178,
"acc_norm": 0.2907801418439716,
"acc_norm_stderr": 0.027090664368353178
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2633637548891786,
"acc_stderr": 0.011249506403605279,
"acc_norm": 0.2633637548891786,
"acc_norm_stderr": 0.011249506403605279
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3713235294117647,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.3713235294117647,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.38181818181818183,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.38181818181818183,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.025000256039546205,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.025000256039546205
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2835820895522388,
"acc_stderr": 0.03187187537919798,
"acc_norm": 0.2835820895522388,
"acc_norm_stderr": 0.03187187537919798
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2289156626506024,
"acc_stderr": 0.03270745277352477,
"acc_norm": 0.2289156626506024,
"acc_norm_stderr": 0.03270745277352477
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.32748538011695905,
"acc_stderr": 0.03599335771456027,
"acc_norm": 0.32748538011695905,
"acc_norm_stderr": 0.03599335771456027
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807762,
"mc2": 0.3674239002696778,
"mc2_stderr": 0.014479746743393794
},
"harness|winogrande|5": {
"acc": 0.5911602209944752,
"acc_stderr": 0.013816954295135683
},
"harness|gsm8k|5": {
"acc": 0.04473085670962851,
"acc_stderr": 0.005693886131407052
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
grimulkan/Augmental-Stenisgate-Augmented | ---
license: unknown
---
A further augmented version of [Augmental-Dataset](https://huggingface.co/datasets/Heralax/Augmental-Dataset) for Steins;Gate-themed RP in FastChat format, modified in the following ways:
- The first prompt is modified to add context and simple references to aspects of the conversation (OOC, use of emojis, content), scenario setup, character introductions.
- All split conversations were joined.
- The assistant always plays a single character (the character with the most lines who is not the first speaker). All other characters are assigned to the user. This is described precisely in the first prompt.
- Conversations alternate between user and assistant, with the first prompt always being from the user, and the last always being from the assistant.
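The alternation and role constraints above can be sketched on a FastChat/ShareGPT-style record. This is an illustrative sketch only: the field names (`id`, `conversations`, `from`, `value`) and the sample turns are assumptions, not copied from the dataset.

```python
# Hypothetical record satisfying the constraints described above.
record = {
    "id": "example-0",
    "conversations": [
        {"from": "human", "value": "[Scenario setup, character introductions, OOC notes ...]"},
        {"from": "gpt", "value": "Kurisu: ..."},
        {"from": "human", "value": "Okabe: ..."},
        {"from": "gpt", "value": "Kurisu: ..."},
    ],
}

turns = [t["from"] for t in record["conversations"]]
assert turns[0] == "human"   # the first prompt is always from the user
assert turns[-1] == "gpt"    # the last turn is always from the assistant
assert all(a != b for a, b in zip(turns, turns[1:]))  # strictly alternating
```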
|
niv-al/sq-anli_a1 | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: labels
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 5975530
num_examples: 16946
- name: validation
num_bytes: 50063
num_examples: 144
- name: test
num_bytes: 51311
num_examples: 144
download_size: 2167104
dataset_size: 6076904
language:
- sq
---
# Dataset Card for "sq-anli_a1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nickstrain/Tamil_English_Corpus_data | ---
license: mit
---
|
silvainrichou/cortex.t_2 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 551521540
num_examples: 199984
download_size: 284809312
dataset_size: 551521540
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Lime1/Tunisian_reddit | ---
license: mit
task_categories:
- conversational
language:
- ar
- fr
- en
pretty_name: Tunisian Reddit Dataset
size_categories:
- 100K<n<1M
---
# r/Tunisia Dataset
## Dataset Description
This repository contains two datasets:
1. [output_comments.csv](output_comments.csv): This file contains the comments data. Each row represents a comment, with attributes such as the comment ID, URL, score, body text, and date (sorted by score): `id,url,score,body,date`
2. [output_posts.csv](output_posts.csv): This file contains the posts data. Each row represents a post, with attributes such as the post ID, URL, score, title, body, top five comments, and date (sorted by date):
`id,url,score,title,body,top_comment1,top_comment2,top_comment3,top_comment4,top_comment5,date`
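The column layouts documented above can be parsed with the standard `csv` module; the sample row below is made up for illustration and does not come from the dataset:

```python
import csv
import io

# The documented column layouts of the two files.
COMMENT_COLS = "id,url,score,body,date".split(",")
POST_COLS = ("id,url,score,title,body,"
             "top_comment1,top_comment2,top_comment3,top_comment4,top_comment5,"
             "date").split(",")

# Hypothetical one-row sample standing in for output_posts.csv.
sample_csv = ",".join(POST_COLS) + "\n" + \
    't3_abc,https://reddit.com/r/Tunisia/abc,42,"Title","Body",c1,c2,c3,c4,c5,2022-12-31\n'

rows = list(csv.DictReader(io.StringIO(sample_csv)))
top_comments = [rows[0][f"top_comment{i}"] for i in range(1, 6)]
```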
Data ranges from 2009-01-01 to 2022-12-31. |
AIQuest/House-Price-In_Multan | ---
license: apache-2.0
---
|
jinaai/cities_wiki_clustering | ---
language:
- en
---
# WikiCities Clustering Dataset
This dataset was created from the [Wikipedia](https://huggingface.co/datasets/wikipedia) training dataset by taking a list of countries,
retrieving all cities for each country, and then finding each city's corresponding article in the Wikipedia dataset. Postprocessing
removed the bottom quartile of countries (those with the fewest city articles) and kept at most 200 articles per country.
The final set contains 126 countries and 3531 cities in total.
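The described postprocessing (removing the quarter of countries with the fewest city articles, then capping each country at 200 articles) can be sketched roughly as follows; `articles_by_country` is a hypothetical mapping, and the exact percentile convention the authors used is not specified in the card:

```python
# Rough sketch of the described postprocessing over a hypothetical
# {country: [article, ...]} mapping. The exact quartile convention
# is an assumption, not confirmed by the card.

def postprocess(articles_by_country, cap=200):
    counts = sorted(len(arts) for arts in articles_by_country.values())
    # Cutoff at the 25th percentile of per-country article counts.
    cutoff = counts[len(counts) // 4]
    return {
        country: arts[:cap]
        for country, arts in articles_by_country.items()
        if len(arts) >= cutoff
    }

toy = {"A": ["a"] * 300, "B": ["b"] * 50, "C": ["c"] * 10, "D": ["d"] * 80}
kept = postprocess(toy)
```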
Below is a distribution of cities by country.
 |
neil-code/samsum-test | ---
configs:
- config_name: default
data_files:
- split: train
path: "./data/corpus-cut/train.csv"
- split: test
path: "./data/corpus-cut/test.csv"
- split: val
path: "./data/corpus-cut/val.csv"
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Am4nu3l/amharic-language-voices | ---
license: mit
---
|
skrishna/coin_flip_16_transformed | ---
dataset_info:
features:
- name: targets
dtype: string
- name: targets_vec
sequence: int64
- name: inputs
dtype: string
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 2137808
num_examples: 2000
- name: test
num_bytes: 2134088
num_examples: 2000
download_size: 1219652
dataset_size: 4271896
---
# Dataset Card for "coin_flip_16_transformed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_xxyyy123__test_qkvo_adptor | ---
pretty_name: Evaluation run of xxyyy123/test_qkvo_adptor
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xxyyy123/test_qkvo_adptor](https://huggingface.co/xxyyy123/test_qkvo_adptor)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xxyyy123__test_qkvo_adptor\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T04:28:57.572800](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__test_qkvo_adptor/blob/main/results_2023-09-03T04%3A28%3A57.572800.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5181817055404072,\n\
\ \"acc_stderr\": 0.03503876216805227,\n \"acc_norm\": 0.5217050459960606,\n\
\ \"acc_norm_stderr\": 0.0350240051469643,\n \"mc1\": 0.37576499388004897,\n\
\ \"mc1_stderr\": 0.016954584060214297,\n \"mc2\": 0.5352609389838828,\n\
\ \"mc2_stderr\": 0.015584516858746202\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5349829351535836,\n \"acc_stderr\": 0.014575583922019669,\n\
\ \"acc_norm\": 0.5537542662116041,\n \"acc_norm_stderr\": 0.014526705548539982\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6007767377016531,\n\
\ \"acc_stderr\": 0.004887378682406531,\n \"acc_norm\": 0.7898824935271859,\n\
\ \"acc_norm_stderr\": 0.004065592811695945\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.0403356566784832,\n\
\ \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.0403356566784832\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n\
\ \"acc_stderr\": 0.041711158581816184,\n \"acc_norm\": 0.5347222222222222,\n\
\ \"acc_norm_stderr\": 0.041711158581816184\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.4682080924855491,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"\
acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5645161290322581,\n\
\ \"acc_stderr\": 0.028206225591502744,\n \"acc_norm\": 0.5645161290322581,\n\
\ \"acc_norm_stderr\": 0.028206225591502744\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.03366124489051449,\n\
\ \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.03366124489051449\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.035243908445117815,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.035243908445117815\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6363636363636364,\n \"acc_stderr\": 0.03427308652999933,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03427308652999933\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845454,\n\
\ \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845454\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5282051282051282,\n \"acc_stderr\": 0.02531063925493388,\n \
\ \"acc_norm\": 0.5282051282051282,\n \"acc_norm_stderr\": 0.02531063925493388\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959916,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959916\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.032437180551374116,\n\
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.032437180551374116\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.037579499229433426,\n \"\
acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.037579499229433426\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7211009174311926,\n \"acc_stderr\": 0.019227468876463507,\n \"\
acc_norm\": 0.7211009174311926,\n \"acc_norm_stderr\": 0.019227468876463507\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373617,\n \"\
acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373617\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7088607594936709,\n \"acc_stderr\": 0.029571601065753374,\n \
\ \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.029571601065753374\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5919282511210763,\n\
\ \"acc_stderr\": 0.03298574607842822,\n \"acc_norm\": 0.5919282511210763,\n\
\ \"acc_norm_stderr\": 0.03298574607842822\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.04243869242230524,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.04243869242230524\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884122,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884122\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n\
\ \"acc_stderr\": 0.04616631111801714,\n \"acc_norm\": 0.6481481481481481,\n\
\ \"acc_norm_stderr\": 0.04616631111801714\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5460122699386503,\n \"acc_stderr\": 0.0391170190467718,\n\
\ \"acc_norm\": 0.5460122699386503,\n \"acc_norm_stderr\": 0.0391170190467718\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.04582124160161549,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.04582124160161549\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.027778835904935437,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.027778835904935437\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7164750957854407,\n\
\ \"acc_stderr\": 0.01611731816683227,\n \"acc_norm\": 0.7164750957854407,\n\
\ \"acc_norm_stderr\": 0.01611731816683227\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.026756255129663765,\n\
\ \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.026756255129663765\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331146,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331146\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5359477124183006,\n \"acc_stderr\": 0.02855582751652878,\n\
\ \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.02855582751652878\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5755627009646302,\n\
\ \"acc_stderr\": 0.028071928247946208,\n \"acc_norm\": 0.5755627009646302,\n\
\ \"acc_norm_stderr\": 0.028071928247946208\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5524691358024691,\n \"acc_stderr\": 0.027667138569422708,\n\
\ \"acc_norm\": 0.5524691358024691,\n \"acc_norm_stderr\": 0.027667138569422708\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596143,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596143\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3767926988265971,\n\
\ \"acc_stderr\": 0.012376459593894398,\n \"acc_norm\": 0.3767926988265971,\n\
\ \"acc_norm_stderr\": 0.012376459593894398\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275668,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275668\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.020227834851568375,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.020227834851568375\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.04724577405731572,\n\
\ \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.04724577405731572\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.636734693877551,\n\
\ \"acc_stderr\": 0.030789051139030806,\n \"acc_norm\": 0.636734693877551,\n\
\ \"acc_norm_stderr\": 0.030789051139030806\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.582089552238806,\n \"acc_stderr\": 0.034875586404620636,\n\
\ \"acc_norm\": 0.582089552238806,\n \"acc_norm_stderr\": 0.034875586404620636\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.3855421686746988,\n \"acc_stderr\": 0.0378913442461155,\n\
\ \"acc_norm\": 0.3855421686746988,\n \"acc_norm_stderr\": 0.0378913442461155\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7134502923976608,\n\
\ \"acc_stderr\": 0.034678266857038266,\n \"acc_norm\": 0.7134502923976608,\n\
\ \"acc_norm_stderr\": 0.034678266857038266\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.37576499388004897,\n \"mc1_stderr\": 0.016954584060214297,\n\
\ \"mc2\": 0.5352609389838828,\n \"mc2_stderr\": 0.015584516858746202\n\
\ }\n}\n```"
repo_url: https://huggingface.co/xxyyy123/test_qkvo_adptor
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|arc:challenge|25_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hellaswag|10_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T04:28:57.572800.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T04:28:57.572800.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T04:28:57.572800.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T04:28:57.572800.parquet'
- config_name: results
data_files:
- split: 2023_09_03T04_28_57.572800
path:
- results_2023-09-03T04:28:57.572800.parquet
- split: latest
path:
- results_2023-09-03T04:28:57.572800.parquet
---
# Dataset Card for Evaluation run of xxyyy123/test_qkvo_adptor
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xxyyy123/test_qkvo_adptor
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xxyyy123/test_qkvo_adptor](https://huggingface.co/xxyyy123/test_qkvo_adptor) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xxyyy123__test_qkvo_adptor",
"harness_truthfulqa_mc_0",
split="train")
```
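Each configuration exposes one split per run timestamp plus a `latest` alias, as in the `data_files` listing above. As a minimal sketch (not part of the generated card), the newest timestamped split can be picked programmatically, since the timestamp names sort lexicographically in chronological order:

```python
# Split names copied from the configuration above: one per run timestamp,
# plus the "latest" alias that always mirrors the newest run.
splits = ["2023_09_03T04_28_57.572800", "latest"]

# The timestamp format sorts lexicographically in chronological order,
# so max() over the non-"latest" names yields the newest run.
newest = max(s for s in splits if s != "latest")
print(newest)
```

With several runs in the list, the same `max()` call would still return the most recent timestamp.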
## Latest results
These are the [latest results from run 2023-09-03T04:28:57.572800](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__test_qkvo_adptor/blob/main/results_2023-09-03T04%3A28%3A57.572800.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5181817055404072,
"acc_stderr": 0.03503876216805227,
"acc_norm": 0.5217050459960606,
"acc_norm_stderr": 0.0350240051469643,
"mc1": 0.37576499388004897,
"mc1_stderr": 0.016954584060214297,
"mc2": 0.5352609389838828,
"mc2_stderr": 0.015584516858746202
},
"harness|arc:challenge|25": {
"acc": 0.5349829351535836,
"acc_stderr": 0.014575583922019669,
"acc_norm": 0.5537542662116041,
"acc_norm_stderr": 0.014526705548539982
},
"harness|hellaswag|10": {
"acc": 0.6007767377016531,
"acc_stderr": 0.004887378682406531,
"acc_norm": 0.7898824935271859,
"acc_norm_stderr": 0.004065592811695945
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.0403356566784832,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.0403356566784832
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.041711158581816184,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.041711158581816184
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.043391383225798615,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.043391383225798615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.32275132275132273,
"acc_stderr": 0.024078943243597016,
"acc_norm": 0.32275132275132273,
"acc_norm_stderr": 0.024078943243597016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5645161290322581,
"acc_stderr": 0.028206225591502744,
"acc_norm": 0.5645161290322581,
"acc_norm_stderr": 0.028206225591502744
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.03366124489051449,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.03366124489051449
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.035243908445117815,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.035243908445117815
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03427308652999933,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03427308652999933
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845454,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845454
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5282051282051282,
"acc_stderr": 0.02531063925493388,
"acc_norm": 0.5282051282051282,
"acc_norm_stderr": 0.02531063925493388
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959916,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959916
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.032437180551374116,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.032437180551374116
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.037579499229433426,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.037579499229433426
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7211009174311926,
"acc_stderr": 0.019227468876463507,
"acc_norm": 0.7211009174311926,
"acc_norm_stderr": 0.019227468876463507
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.03213325717373617,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.03213325717373617
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5919282511210763,
"acc_stderr": 0.03298574607842822,
"acc_norm": 0.5919282511210763,
"acc_norm_stderr": 0.03298574607842822
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.04243869242230524,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.04243869242230524
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884122,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884122
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.04616631111801714,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.04616631111801714
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5460122699386503,
"acc_stderr": 0.0391170190467718,
"acc_norm": 0.5460122699386503,
"acc_norm_stderr": 0.0391170190467718
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.04582124160161549,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.04582124160161549
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.027778835904935437,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.027778835904935437
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7164750957854407,
"acc_stderr": 0.01611731816683227,
"acc_norm": 0.7164750957854407,
"acc_norm_stderr": 0.01611731816683227
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.026756255129663765,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.026756255129663765
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331146,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331146
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.02855582751652878,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.02855582751652878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5755627009646302,
"acc_stderr": 0.028071928247946208,
"acc_norm": 0.5755627009646302,
"acc_norm_stderr": 0.028071928247946208
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5524691358024691,
"acc_stderr": 0.027667138569422708,
"acc_norm": 0.5524691358024691,
"acc_norm_stderr": 0.027667138569422708
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596143,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596143
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3767926988265971,
"acc_stderr": 0.012376459593894398,
"acc_norm": 0.3767926988265971,
"acc_norm_stderr": 0.012376459593894398
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275668,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275668
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5,
"acc_stderr": 0.020227834851568375,
"acc_norm": 0.5,
"acc_norm_stderr": 0.020227834851568375
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.582089552238806,
"acc_stderr": 0.034875586404620636,
"acc_norm": 0.582089552238806,
"acc_norm_stderr": 0.034875586404620636
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.0378913442461155,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.0378913442461155
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37576499388004897,
"mc1_stderr": 0.016954584060214297,
"mc2": 0.5352609389838828,
"mc2_stderr": 0.015584516858746202
}
}
```
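The per-task entries above can be re-aggregated after loading. The snippet below is a minimal sketch (not produced by the evaluation pipeline) that recomputes a macro-average `acc` over tasks, using two entries copied from the results above; the real dict holds one such entry per task:

```python
# Two per-task entries copied from the results JSON above; the full dict
# holds one entry per evaluated task.
results = {
    "harness|arc:challenge|25": {"acc": 0.5349829351535836},
    "harness|hellaswag|10": {"acc": 0.6007767377016531},
}

# Macro-average accuracy: mean of "acc" over every task that reports it
# (the truthfulqa entry, for instance, reports mc1/mc2 instead of acc).
accs = [scores["acc"] for scores in results.values() if "acc" in scores]
mean_acc = sum(accs) / len(accs)
```

The "all" block at the top of the JSON reports aggregates of this kind computed across the full run.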
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Suchinthana/Databricks-Dolly-15k-si-en-mix | ---
license: cc-by-sa-3.0
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 41110595
num_examples: 30022
download_size: 20098720
dataset_size: 41110595
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- si
- en
--- |
johannes-garstenauer/pooling_net_embeddings_dim_16_masked_dataset_1p | ---
dataset_info:
features:
- name: last_hs
sequence: float32
- name: label
dtype: int64
splits:
- name: train
num_bytes: 51148
num_examples: 673
download_size: 61004
dataset_size: 51148
---
# Dataset Card for "pooling_net_embeddings_dim_16_masked_dataset_1p"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_AA051611__A0110 | ---
pretty_name: Evaluation run of AA051611/A0110
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051611/A0110](https://huggingface.co/AA051611/A0110) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051611__A0110\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-11T07:22:47.294306](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__A0110/blob/main/results_2024-01-11T07-22-47.294306.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7412573048084544,\n\
\ \"acc_stderr\": 0.028863410665461813,\n \"acc_norm\": 0.745188483421092,\n\
\ \"acc_norm_stderr\": 0.029413193311414305,\n \"mc1\": 0.397796817625459,\n\
\ \"mc1_stderr\": 0.017133934248559635,\n \"mc2\": 0.5860453215820967,\n\
\ \"mc2_stderr\": 0.015209324923113767\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6356655290102389,\n \"acc_stderr\": 0.014063260279882417,\n\
\ \"acc_norm\": 0.6638225255972696,\n \"acc_norm_stderr\": 0.013804855026205761\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6546504680342561,\n\
\ \"acc_stderr\": 0.0047451035439012934,\n \"acc_norm\": 0.8473411670981876,\n\
\ \"acc_norm_stderr\": 0.003589232889306521\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n\
\ \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.7185185185185186,\n\
\ \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.868421052631579,\n \"acc_stderr\": 0.027508689533549905,\n\
\ \"acc_norm\": 0.868421052631579,\n \"acc_norm_stderr\": 0.027508689533549905\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7924528301886793,\n \"acc_stderr\": 0.024959918028911274,\n\
\ \"acc_norm\": 0.7924528301886793,\n \"acc_norm_stderr\": 0.024959918028911274\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7109826589595376,\n\
\ \"acc_stderr\": 0.03456425745086999,\n \"acc_norm\": 0.7109826589595376,\n\
\ \"acc_norm_stderr\": 0.03456425745086999\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.04966570903978529,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.04966570903978529\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7617021276595745,\n \"acc_stderr\": 0.027851252973889788,\n\
\ \"acc_norm\": 0.7617021276595745,\n \"acc_norm_stderr\": 0.027851252973889788\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5789473684210527,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.5789473684210527,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7310344827586207,\n \"acc_stderr\": 0.036951833116502325,\n\
\ \"acc_norm\": 0.7310344827586207,\n \"acc_norm_stderr\": 0.036951833116502325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.626984126984127,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\"\
: 0.626984126984127,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8774193548387097,\n\
\ \"acc_stderr\": 0.018656720991789406,\n \"acc_norm\": 0.8774193548387097,\n\
\ \"acc_norm_stderr\": 0.018656720991789406\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.03465304488406795,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.03465304488406795\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284357,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284357\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8888888888888888,\n \"acc_stderr\": 0.02239078763821677,\n \"\
acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02239078763821677\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9637305699481865,\n \"acc_stderr\": 0.013492659751295145,\n\
\ \"acc_norm\": 0.9637305699481865,\n \"acc_norm_stderr\": 0.013492659751295145\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8,\n \"acc_stderr\": 0.020280805062535722,\n \"acc_norm\"\
: 0.8,\n \"acc_norm_stderr\": 0.020280805062535722\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02995824925008211,\n\
\ \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02995824925008211\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8403361344537815,\n \"acc_stderr\": 0.0237933539975288,\n \
\ \"acc_norm\": 0.8403361344537815,\n \"acc_norm_stderr\": 0.0237933539975288\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.45695364238410596,\n \"acc_stderr\": 0.04067325174247443,\n \"\
acc_norm\": 0.45695364238410596,\n \"acc_norm_stderr\": 0.04067325174247443\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9064220183486239,\n \"acc_stderr\": 0.012486841824601963,\n \"\
acc_norm\": 0.9064220183486239,\n \"acc_norm_stderr\": 0.012486841824601963\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.0321495214780275,\n \"acc_norm\"\
: 0.6666666666666666,\n \"acc_norm_stderr\": 0.0321495214780275\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n\
\ \"acc_stderr\": 0.019907399791316942,\n \"acc_norm\": 0.9117647058823529,\n\
\ \"acc_norm_stderr\": 0.019907399791316942\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065515,\n\
\ \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065515\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7713004484304933,\n\
\ \"acc_stderr\": 0.028188240046929203,\n \"acc_norm\": 0.7713004484304933,\n\
\ \"acc_norm_stderr\": 0.028188240046929203\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597453,\n\
\ \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597453\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9008264462809917,\n \"acc_stderr\": 0.02728524631275896,\n \"\
acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.02728524631275896\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.030381596756651655,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.030381596756651655\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8404907975460123,\n \"acc_stderr\": 0.028767481725983878,\n\
\ \"acc_norm\": 0.8404907975460123,\n \"acc_norm_stderr\": 0.028767481725983878\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5892857142857143,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.5892857142857143,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.912621359223301,\n \"acc_stderr\": 0.027960689125970654,\n\
\ \"acc_norm\": 0.912621359223301,\n \"acc_norm_stderr\": 0.027960689125970654\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n\
\ \"acc_stderr\": 0.01604626163167314,\n \"acc_norm\": 0.9358974358974359,\n\
\ \"acc_norm_stderr\": 0.01604626163167314\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9042145593869731,\n\
\ \"acc_stderr\": 0.010524031079055826,\n \"acc_norm\": 0.9042145593869731,\n\
\ \"acc_norm_stderr\": 0.010524031079055826\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8005780346820809,\n \"acc_stderr\": 0.021511900654252555,\n\
\ \"acc_norm\": 0.8005780346820809,\n \"acc_norm_stderr\": 0.021511900654252555\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6189944134078212,\n\
\ \"acc_stderr\": 0.016242028834053613,\n \"acc_norm\": 0.6189944134078212,\n\
\ \"acc_norm_stderr\": 0.016242028834053613\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8071895424836601,\n \"acc_stderr\": 0.02258931888817669,\n\
\ \"acc_norm\": 0.8071895424836601,\n \"acc_norm_stderr\": 0.02258931888817669\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8295819935691319,\n\
\ \"acc_stderr\": 0.021355343028264043,\n \"acc_norm\": 0.8295819935691319,\n\
\ \"acc_norm_stderr\": 0.021355343028264043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8302469135802469,\n \"acc_stderr\": 0.02088869041409387,\n\
\ \"acc_norm\": 0.8302469135802469,\n \"acc_norm_stderr\": 0.02088869041409387\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6063829787234043,\n \"acc_stderr\": 0.02914454478159616,\n \
\ \"acc_norm\": 0.6063829787234043,\n \"acc_norm_stderr\": 0.02914454478159616\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5573663624511083,\n\
\ \"acc_stderr\": 0.012685906538206237,\n \"acc_norm\": 0.5573663624511083,\n\
\ \"acc_norm_stderr\": 0.012685906538206237\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.023157468308559345,\n\
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.023157468308559345\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7924836601307189,\n \"acc_stderr\": 0.016405924270103234,\n \
\ \"acc_norm\": 0.7924836601307189,\n \"acc_norm_stderr\": 0.016405924270103234\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8081632653061225,\n \"acc_stderr\": 0.0252069631542254,\n\
\ \"acc_norm\": 0.8081632653061225,\n \"acc_norm_stderr\": 0.0252069631542254\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n\
\ \"acc_stderr\": 0.021166216304659393,\n \"acc_norm\": 0.900497512437811,\n\
\ \"acc_norm_stderr\": 0.021166216304659393\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.94,\n \"acc_stderr\": 0.023868325657594173,\n \
\ \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.023868325657594173\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.02353755765789256,\n\
\ \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.02353755765789256\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.397796817625459,\n\
\ \"mc1_stderr\": 0.017133934248559635,\n \"mc2\": 0.5860453215820967,\n\
\ \"mc2_stderr\": 0.015209324923113767\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918739\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6482183472327521,\n \
\ \"acc_stderr\": 0.013153446023536039\n }\n}\n```"
repo_url: https://huggingface.co/AA051611/A0110
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|arc:challenge|25_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|gsm8k|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hellaswag|10_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T07-22-47.294306.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-11T07-22-47.294306.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- '**/details_harness|winogrande|5_2024-01-11T07-22-47.294306.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-11T07-22-47.294306.parquet'
- config_name: results
data_files:
- split: 2024_01_11T07_22_47.294306
path:
- results_2024-01-11T07-22-47.294306.parquet
- split: latest
path:
- results_2024-01-11T07-22-47.294306.parquet
---
# Dataset Card for Evaluation run of AA051611/A0110
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051611/A0110](https://huggingface.co/AA051611/A0110) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051611__A0110",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-11T07:22:47.294306](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__A0110/blob/main/results_2024-01-11T07-22-47.294306.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7412573048084544,
"acc_stderr": 0.028863410665461813,
"acc_norm": 0.745188483421092,
"acc_norm_stderr": 0.029413193311414305,
"mc1": 0.397796817625459,
"mc1_stderr": 0.017133934248559635,
"mc2": 0.5860453215820967,
"mc2_stderr": 0.015209324923113767
},
"harness|arc:challenge|25": {
"acc": 0.6356655290102389,
"acc_stderr": 0.014063260279882417,
"acc_norm": 0.6638225255972696,
"acc_norm_stderr": 0.013804855026205761
},
"harness|hellaswag|10": {
"acc": 0.6546504680342561,
"acc_stderr": 0.0047451035439012934,
"acc_norm": 0.8473411670981876,
"acc_norm_stderr": 0.003589232889306521
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.868421052631579,
"acc_stderr": 0.027508689533549905,
"acc_norm": 0.868421052631579,
"acc_norm_stderr": 0.027508689533549905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7924528301886793,
"acc_stderr": 0.024959918028911274,
"acc_norm": 0.7924528301886793,
"acc_norm_stderr": 0.024959918028911274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.03456425745086999,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.03456425745086999
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.04966570903978529,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.04966570903978529
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7617021276595745,
"acc_stderr": 0.027851252973889788,
"acc_norm": 0.7617021276595745,
"acc_norm_stderr": 0.027851252973889788
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7310344827586207,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.7310344827586207,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.626984126984127,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.626984126984127,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8774193548387097,
"acc_stderr": 0.018656720991789406,
"acc_norm": 0.8774193548387097,
"acc_norm_stderr": 0.018656720991789406
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.03465304488406795,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.03465304488406795
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.028450388805284357,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.028450388805284357
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02239078763821677,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02239078763821677
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9637305699481865,
"acc_stderr": 0.013492659751295145,
"acc_norm": 0.9637305699481865,
"acc_norm_stderr": 0.013492659751295145
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8,
"acc_stderr": 0.020280805062535722,
"acc_norm": 0.8,
"acc_norm_stderr": 0.020280805062535722
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02995824925008211,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02995824925008211
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8403361344537815,
"acc_stderr": 0.0237933539975288,
"acc_norm": 0.8403361344537815,
"acc_norm_stderr": 0.0237933539975288
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.45695364238410596,
"acc_stderr": 0.04067325174247443,
"acc_norm": 0.45695364238410596,
"acc_norm_stderr": 0.04067325174247443
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9064220183486239,
"acc_stderr": 0.012486841824601963,
"acc_norm": 0.9064220183486239,
"acc_norm_stderr": 0.012486841824601963
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0321495214780275,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0321495214780275
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316942,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316942
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065515,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065515
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7713004484304933,
"acc_stderr": 0.028188240046929203,
"acc_norm": 0.7713004484304933,
"acc_norm_stderr": 0.028188240046929203
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597453,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597453
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.02728524631275896,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.02728524631275896
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.030381596756651655,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.030381596756651655
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8404907975460123,
"acc_stderr": 0.028767481725983878,
"acc_norm": 0.8404907975460123,
"acc_norm_stderr": 0.028767481725983878
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5892857142857143,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.5892857142857143,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.912621359223301,
"acc_stderr": 0.027960689125970654,
"acc_norm": 0.912621359223301,
"acc_norm_stderr": 0.027960689125970654
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.01604626163167314,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.01604626163167314
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9042145593869731,
"acc_stderr": 0.010524031079055826,
"acc_norm": 0.9042145593869731,
"acc_norm_stderr": 0.010524031079055826
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8005780346820809,
"acc_stderr": 0.021511900654252555,
"acc_norm": 0.8005780346820809,
"acc_norm_stderr": 0.021511900654252555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6189944134078212,
"acc_stderr": 0.016242028834053613,
"acc_norm": 0.6189944134078212,
"acc_norm_stderr": 0.016242028834053613
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8071895424836601,
"acc_stderr": 0.02258931888817669,
"acc_norm": 0.8071895424836601,
"acc_norm_stderr": 0.02258931888817669
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8295819935691319,
"acc_stderr": 0.021355343028264043,
"acc_norm": 0.8295819935691319,
"acc_norm_stderr": 0.021355343028264043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8302469135802469,
"acc_stderr": 0.02088869041409387,
"acc_norm": 0.8302469135802469,
"acc_norm_stderr": 0.02088869041409387
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6063829787234043,
"acc_stderr": 0.02914454478159616,
"acc_norm": 0.6063829787234043,
"acc_norm_stderr": 0.02914454478159616
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5573663624511083,
"acc_stderr": 0.012685906538206237,
"acc_norm": 0.5573663624511083,
"acc_norm_stderr": 0.012685906538206237
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.023157468308559345,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.023157468308559345
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7924836601307189,
"acc_stderr": 0.016405924270103234,
"acc_norm": 0.7924836601307189,
"acc_norm_stderr": 0.016405924270103234
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8081632653061225,
"acc_stderr": 0.0252069631542254,
"acc_norm": 0.8081632653061225,
"acc_norm_stderr": 0.0252069631542254
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659393,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659393
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.94,
"acc_stderr": 0.023868325657594173,
"acc_norm": 0.94,
"acc_norm_stderr": 0.023868325657594173
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8947368421052632,
"acc_stderr": 0.02353755765789256,
"acc_norm": 0.8947368421052632,
"acc_norm_stderr": 0.02353755765789256
},
"harness|truthfulqa:mc|0": {
"mc1": 0.397796817625459,
"mc1_stderr": 0.017133934248559635,
"mc2": 0.5860453215820967,
"mc2_stderr": 0.015209324923113767
},
"harness|winogrande|5": {
"acc": 0.8232044198895028,
"acc_stderr": 0.010721923287918739
},
"harness|gsm8k|5": {
"acc": 0.6482183472327521,
"acc_stderr": 0.013153446023536039
}
}
```
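The per-task keys in the results above follow the pattern `harness|<task>|<num_few_shot>`. A minimal sketch of splitting these keys apart, using values copied from the excerpt (the helper name `parse_task_key` is illustrative, not part of the harness):

```python
def parse_task_key(key: str):
    """Split an eval key like 'harness|winogrande|5' into (task, few-shot count)."""
    _, task, n_shot = key.split("|")
    return task, int(n_shot)

# Trimmed excerpt of the results JSON shown above.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6638225255972696},
    "harness|winogrande|5": {"acc": 0.8232044198895028},
    "harness|gsm8k|5": {"acc": 0.6482183472327521},
}

for key, metrics in results.items():
    task, n_shot = parse_task_key(key)
    print(f"{task} ({n_shot}-shot): {metrics}")
```

Note that task names may themselves contain a colon (e.g. `arc:challenge`), so splitting on `|` rather than on punctuation in general is the safe choice.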
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
pivoboy/DeusReiDarius | ---
license: openrail
---
|
dltjdgh0928/klue_mrc | ---
license: apache-2.0
---
|
alvarobartt/instruction-dataset-mistral-7b-instruct-v0.2 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: generation
dtype: string
- name: model_name
dtype: string
splits:
- name: train
num_bytes: 244298
num_examples: 327
download_size: 155520
dataset_size: 244298
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
task_categories:
- text-generation
language:
- en
tags:
- synthetic
size_categories:
- n<1K
source_datasets:
- HuggingFaceH4/instruction-dataset
---
# HuggingFaceH4/instruction-dataset but generated with Mistral-7B-Instruct-v0.2
This dataset has been generated with [`distilabel v1.0.0.b0`](https://github.com/argilla-io/distilabel/tree/4fb29c262ba5dc5c97ed6497bc2b49a41060492c),
using [`vLLM`](https://github.com/vllm-project/vllm) to run [`mistralai/Mistral-7B-Instruct-v0.2`](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2)
over the `prompt` column of the dataset [`HuggingFaceH4/instruction-dataset`](https://huggingface.co/datasets/HuggingFaceH4/instruction-dataset).
David99YY/instruct_generation_data | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: int64
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 118902
num_examples: 500
- name: validation
num_bytes: 142187
num_examples: 500
download_size: 132225
dataset_size: 261089
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
CyberHarem/collon_sukasuka | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Collon Rin Purgatrio/コロン・リン・プルガトリオ (Shuumatsu Nani Shitemasu Ka? Isogashii Desu Ka?)
This is the dataset of Collon Rin Purgatrio/コロン・リン・プルガトリオ (Shuumatsu Nani Shitemasu Ka? Isogashii Desu Ka?), containing 54 images and their tags.
The core tags of this character are `pink_hair, long_hair, pink_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 54 | 29.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/collon_sukasuka/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 54 | 29.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/collon_sukasuka/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 82 | 49.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/collon_sukasuka/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/collon_sukasuka',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, open_mouth, shorts, solo, :d, shirt |
| 1 | 6 |  |  |  |  |  | 2girls, purple_hair, red_eyes, solo_focus, 1girl, open_mouth, smile |
| 2 | 6 |  |  |  |  |  | 3girls, broom, green_hair, grin, mop, red_eyes, short_hair, wooden_floor |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | open_mouth | shorts | solo | :d | shirt | 2girls | purple_hair | red_eyes | solo_focus | smile | 3girls | broom | green_hair | grin | mop | short_hair | wooden_floor |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:---------|:-------|:-----|:--------|:---------|:--------------|:-----------|:-------------|:--------|:---------|:--------|:-------------|:-------|:------|:-------------|:---------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | | | | | X | X | X | X | X | | | | | | | |
| 2 | 6 |  |  |  |  |  | | | | | | | | | X | | | X | X | X | X | X | X | X |
|
taesiri/arxiv_thumbnails | ---
license: cc-by-4.0
---
|
llmixer/20k_random_data | ---
tags:
- 20k
- random
---
Dataset created from the 20k_random_data.txt file linked by [kalomaze](https://github.com/kalomaze) here: [https://github.com/ggerganov/llama.cpp/discussions/5006](https://github.com/ggerganov/llama.cpp/discussions/5006#discussioncomment-8163190) |
christykoh/ag_news_ar | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': World
'1': Sports
'2': Business
'3': Sci/Tech
splits:
- name: train
num_bytes: 41887724
num_examples: 120000
- name: test
num_bytes: 2650172
num_examples: 7600
download_size: 23151161
dataset_size: 44537896
---
# Dataset Card for "ag_news_ar"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
huggingface/autotrain-data-testing | |
hardikch05/sql_to_text | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 1402
num_examples: 1
download_size: 10620
dataset_size: 1402
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
chronbmm/vedic-dependency-parsing | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: unsandhied
dtype: string
splits:
- name: train
num_bytes: 4155754
num_examples: 7178
- name: validation
num_bytes: 198491
num_examples: 330
- name: test
num_bytes: 196230
num_examples: 340
download_size: 2351596
dataset_size: 4550475
---
# Dataset Card for "vedic-dependency-parsing"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
male-2/training_v0.0.4-public | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 276
num_examples: 1
download_size: 2824
dataset_size: 276
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "training_v0.0.4-public"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/kamikaze_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kamikaze/神風/神风 (Azur Lane)
This is the dataset of kamikaze/神風/神风 (Azur Lane), containing 36 images and their tags.
The core tags of this character are `animal_ears, fox_ears, long_hair, multicolored_hair, grey_hair, streaked_hair, tail, red_hair, red_eyes, bangs, ribbon, fox_tail, hair_ribbon, hair_ornament, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 36 | 42.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamikaze_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 36 | 27.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamikaze_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 78 | 54.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamikaze_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 36 | 38.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamikaze_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 78 | 70.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamikaze_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kamikaze_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, :q, wide_sleeves, closed_mouth, long_sleeves, black_skirt, simple_background, white_background, full_body, jingle_bell, white_socks, hair_bell, pleated_skirt, ribbon-trimmed_sleeves, standing, black_hakama, gohei, hakama_short_skirt, holding, kimono, petals |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | smile | solo | :q | wide_sleeves | closed_mouth | long_sleeves | black_skirt | simple_background | white_background | full_body | jingle_bell | white_socks | hair_bell | pleated_skirt | ribbon-trimmed_sleeves | standing | black_hakama | gohei | hakama_short_skirt | holding | kimono | petals |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------|:-----|:---------------|:---------------|:---------------|:--------------|:--------------------|:-------------------|:------------|:--------------|:--------------|:------------|:----------------|:-------------------------|:-----------|:---------------|:--------|:---------------------|:----------|:---------|:---------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
LangChainHub-Prompts/LLM_Bash |
---
tags:
- langchain
- prompt
---
# Description of LLM Bash
Prompt designed to convert natural language to a bash command.
## Inputs
This is a description of the inputs that the prompt expects.
question: User question to be answered by writing a bash command.
## Usage
Below is a code snippet for how to use the prompt.
```python
from langchain.prompts import load_prompt
from langchain.chains import LLMBashChain
llm = ...
prompt = load_prompt('lc://prompts/llm_bash/<file-name>')
chain = LLMBashChain(llm=llm, prompt=prompt)
```
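Under the hood, the chain expects the model to return the command inside a bash-fenced code block and parses it out before execution. A minimal stand-alone sketch of that parsing step (a hypothetical helper for illustration, not the actual langchain output parser):

```python
import re

FENCE = "`" * 3  # the triple-backtick delimiter used in model replies

def extract_bash_commands(llm_output):
    """Return the command lines from the first bash-fenced block, or []."""
    pattern = FENCE + r"bash\n(.*?)" + FENCE
    match = re.search(pattern, llm_output, re.DOTALL)
    if match is None:
        return []
    return [line for line in match.group(1).splitlines() if line.strip()]

reply = "Sure, here you go:\n" + FENCE + "bash\nls -al\n" + FENCE
print(extract_bash_commands(reply))  # ['ls -al']
```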
|
liuyanchen1015/MULTI_VALUE_mnli_invariant_tag_non_concord | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 21882
num_examples: 152
- name: dev_mismatched
num_bytes: 32709
num_examples: 241
- name: test_matched
num_bytes: 21786
num_examples: 145
- name: test_mismatched
num_bytes: 34911
num_examples: 234
- name: train
num_bytes: 879303
num_examples: 5702
download_size: 548191
dataset_size: 990591
---
# Dataset Card for "MULTI_VALUE_mnli_invariant_tag_non_concord"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Maiconn/piuvozes | ---
license: openrail
---
|
DezS/Accent-and-accentless-Vietnamese-dataset | ---
license: pddl
---
This dataset consists of two columns: Sentences and AccentlessSentences.
This is the dataset used to train this model: https://huggingface.co/DezS/T5-Vietnamese-Accent-Adder
|
alirzb/SeizureClassifier_Wav2Vec_43243531_on_UnBal_43765865 | ---
dataset_info:
features:
- name: array
sequence: float64
- name: label_true
dtype: int64
- name: label_pred
dtype: int64
splits:
- name: train
num_bytes: 3693924
num_examples: 9
download_size: 1117454
dataset_size: 3693924
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ahmadSiddiqi/tweet_sentiment_fr | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 206099
num_examples: 1839
- name: validation
num_bytes: 35976
num_examples: 324
download_size: 149995
dataset_size: 242075
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
Mim/autotrain-data-procell-expert | ---
task_categories:
- text-classification
---
# AutoTrain Dataset for project: procell-expert
## Dataset Description
This dataset has been automatically processed by AutoTrain for project procell-expert.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "We studied the antitumor activity and toxicity of ZD1694 (tomudex), a specific inhibitor of thymidyl[...]",
"target": 0
},
{
"text": "Here we provide data that human prostate cancer cell lines express the platelet-type isoform of 12-L[...]",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(num_classes=2, names=['accept', 'reject'], id=None)"
}
```
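The `target` field holds the integer class index; converting it back to a name just follows the order given in the ClassLabel definition above (a minimal stand-alone sketch, without the datasets library):

```python
# Order taken from the ClassLabel definition: 0 -> accept, 1 -> reject
label_names = ["accept", "reject"]

samples = [
    {"text": "We studied the antitumor activity and toxicity of ZD1694 ...", "target": 0},
    {"text": "Here we provide data that human prostate cancer cell lines ...", "target": 0},
]

for sample in samples:
    # equivalent of ClassLabel.int2str: index into the names list
    print(label_names[sample["target"]])  # prints 'accept' for both samples
```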
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 155 |
| valid | 40 |
|
typosonlr/ChatDoctor_50percent_preprocessed | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 92323505
num_examples: 98515
download_size: 55646760
dataset_size: 92323505
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_s3nh__nsfw-noromaid-mistral-instruct | ---
pretty_name: Evaluation run of s3nh/nsfw-noromaid-mistral-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [s3nh/nsfw-noromaid-mistral-instruct](https://huggingface.co/s3nh/nsfw-noromaid-mistral-instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_s3nh__nsfw-noromaid-mistral-instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-08T10:52:46.659107](https://huggingface.co/datasets/open-llm-leaderboard/details_s3nh__nsfw-noromaid-mistral-instruct/blob/main/results_2024-01-08T10-52-46.659107.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4636397253806667,\n\
\ \"acc_stderr\": 0.03433097238406634,\n \"acc_norm\": 0.4705183598294859,\n\
\ \"acc_norm_stderr\": 0.03515170311421716,\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023495,\n \"mc2\": 0.3349241921724532,\n\
\ \"mc2_stderr\": 0.013441943397542705\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4786689419795222,\n \"acc_stderr\": 0.014598087973127106,\n\
\ \"acc_norm\": 0.5179180887372014,\n \"acc_norm_stderr\": 0.014602005585490976\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5367456681935869,\n\
\ \"acc_stderr\": 0.0049762883216818215,\n \"acc_norm\": 0.7539334793865764,\n\
\ \"acc_norm_stderr\": 0.004298374936365623\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490437,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490437\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205615,\n\
\ \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205615\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.041553199555931467,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.041553199555931467\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.039505818611799616,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.039505818611799616\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.032469569197899575,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.032469569197899575\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819074,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819074\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.45806451612903226,\n\
\ \"acc_stderr\": 0.028343787250540618,\n \"acc_norm\": 0.45806451612903226,\n\
\ \"acc_norm_stderr\": 0.028343787250540618\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.03445487686264716,\n\
\ \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.03445487686264716\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6060606060606061,\n \"acc_stderr\": 0.03481285338232963,\n \"\
acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.03481285338232963\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6683937823834197,\n \"acc_stderr\": 0.03397636541089118,\n\
\ \"acc_norm\": 0.6683937823834197,\n \"acc_norm_stderr\": 0.03397636541089118\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4641025641025641,\n \"acc_stderr\": 0.025285585990017848,\n\
\ \"acc_norm\": 0.4641025641025641,\n \"acc_norm_stderr\": 0.025285585990017848\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.46218487394957986,\n \"acc_stderr\": 0.032385469487589795,\n\
\ \"acc_norm\": 0.46218487394957986,\n \"acc_norm_stderr\": 0.032385469487589795\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5614678899082569,\n \"acc_stderr\": 0.02127471307395458,\n \"\
acc_norm\": 0.5614678899082569,\n \"acc_norm_stderr\": 0.02127471307395458\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3287037037037037,\n \"acc_stderr\": 0.03203614084670058,\n \"\
acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.03203614084670058\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.35294117647058826,\n \"acc_stderr\": 0.033540924375915195,\n \"\
acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.033540924375915195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5569620253164557,\n \"acc_stderr\": 0.03233532777533484,\n \
\ \"acc_norm\": 0.5569620253164557,\n \"acc_norm_stderr\": 0.03233532777533484\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n\
\ \"acc_stderr\": 0.03337883736255099,\n \"acc_norm\": 0.5515695067264574,\n\
\ \"acc_norm_stderr\": 0.03337883736255099\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884122,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884122\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.49074074074074076,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.49074074074074076,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5276073619631901,\n \"acc_stderr\": 0.0392237829061099,\n\
\ \"acc_norm\": 0.5276073619631901,\n \"acc_norm_stderr\": 0.0392237829061099\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6752136752136753,\n\
\ \"acc_stderr\": 0.03067902276549883,\n \"acc_norm\": 0.6752136752136753,\n\
\ \"acc_norm_stderr\": 0.03067902276549883\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488585,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488585\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.669220945083014,\n\
\ \"acc_stderr\": 0.016824818462563746,\n \"acc_norm\": 0.669220945083014,\n\
\ \"acc_norm_stderr\": 0.016824818462563746\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5491329479768786,\n \"acc_stderr\": 0.026788811931562757,\n\
\ \"acc_norm\": 0.5491329479768786,\n \"acc_norm_stderr\": 0.026788811931562757\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27039106145251396,\n\
\ \"acc_stderr\": 0.014854993938010083,\n \"acc_norm\": 0.27039106145251396,\n\
\ \"acc_norm_stderr\": 0.014854993938010083\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5359477124183006,\n \"acc_stderr\": 0.028555827516528784,\n\
\ \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.028555827516528784\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5530546623794212,\n\
\ \"acc_stderr\": 0.028237769422085328,\n \"acc_norm\": 0.5530546623794212,\n\
\ \"acc_norm_stderr\": 0.028237769422085328\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5339506172839507,\n \"acc_stderr\": 0.02775653525734767,\n\
\ \"acc_norm\": 0.5339506172839507,\n \"acc_norm_stderr\": 0.02775653525734767\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.38652482269503546,\n \"acc_stderr\": 0.02904919034254347,\n \
\ \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.02904919034254347\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2835723598435463,\n\
\ \"acc_stderr\": 0.011511900775968328,\n \"acc_norm\": 0.2835723598435463,\n\
\ \"acc_norm_stderr\": 0.011511900775968328\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.375,\n \"acc_stderr\": 0.029408372932278746,\n \
\ \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.029408372932278746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4852941176470588,\n \"acc_stderr\": 0.020219083895133924,\n \
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.020219083895133924\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5224489795918368,\n \"acc_stderr\": 0.03197694118713672,\n\
\ \"acc_norm\": 0.5224489795918368,\n \"acc_norm_stderr\": 0.03197694118713672\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.527363184079602,\n\
\ \"acc_stderr\": 0.035302355173346824,\n \"acc_norm\": 0.527363184079602,\n\
\ \"acc_norm_stderr\": 0.035302355173346824\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.672514619883041,\n \"acc_stderr\": 0.035993357714560276,\n\
\ \"acc_norm\": 0.672514619883041,\n \"acc_norm_stderr\": 0.035993357714560276\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023495,\n \"mc2\": 0.3349241921724532,\n\
\ \"mc2_stderr\": 0.013441943397542705\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7119179163378059,\n \"acc_stderr\": 0.012727884724248115\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06595905989385899,\n \
\ \"acc_stderr\": 0.006836951192034228\n }\n}\n```"
repo_url: https://huggingface.co/s3nh/nsfw-noromaid-mistral-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|arc:challenge|25_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|gsm8k|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hellaswag|10_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T10-52-46.659107.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T10-52-46.659107.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- '**/details_harness|winogrande|5_2024-01-08T10-52-46.659107.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-08T10-52-46.659107.parquet'
- config_name: results
data_files:
- split: 2024_01_08T10_52_46.659107
path:
- results_2024-01-08T10-52-46.659107.parquet
- split: latest
path:
- results_2024-01-08T10-52-46.659107.parquet
---
# Dataset Card for Evaluation run of s3nh/nsfw-noromaid-mistral-instruct
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [s3nh/nsfw-noromaid-mistral-instruct](https://huggingface.co/s3nh/nsfw-noromaid-mistral-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run appears as a specific split in each configuration, with the split named after the run's timestamp. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_s3nh__nsfw-noromaid-mistral-instruct",
"harness_winogrande_5",
split="train")
```
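Per-run split names (for example `2024_01_08T10_52_46.659107` in the configuration listing above) are derived from the run timestamp by replacing the dashes and colons with underscores, while the fractional-second dot is kept. A minimal sketch of that convention, assuming only this character substitution:

```python
# Sketch: derive the per-run split name used in this card from a run
# timestamp. Dashes and colons become underscores; the dot before the
# fractional seconds is preserved, matching the splits listed above.
def split_name(timestamp: str) -> str:
    return timestamp.replace("-", "_").replace(":", "_")

print(split_name("2024-01-08T10:52:46.659107"))
# -> 2024_01_08T10_52_46.659107
```

This lets you map a results filename such as `results_2024-01-08T10-52-46.659107.json` back to the split holding that run's details.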
## Latest results
These are the [latest results from run 2024-01-08T10:52:46.659107](https://huggingface.co/datasets/open-llm-leaderboard/details_s3nh__nsfw-noromaid-mistral-instruct/blob/main/results_2024-01-08T10-52-46.659107.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each eval's results in its own configuration and "latest" split):
```python
{
"all": {
"acc": 0.4636397253806667,
"acc_stderr": 0.03433097238406634,
"acc_norm": 0.4705183598294859,
"acc_norm_stderr": 0.03515170311421716,
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023495,
"mc2": 0.3349241921724532,
"mc2_stderr": 0.013441943397542705
},
"harness|arc:challenge|25": {
"acc": 0.4786689419795222,
"acc_stderr": 0.014598087973127106,
"acc_norm": 0.5179180887372014,
"acc_norm_stderr": 0.014602005585490976
},
"harness|hellaswag|10": {
"acc": 0.5367456681935869,
"acc_stderr": 0.0049762883216818215,
"acc_norm": 0.7539334793865764,
"acc_norm_stderr": 0.004298374936365623
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490437,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490437
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5245283018867924,
"acc_stderr": 0.030735822206205615,
"acc_norm": 0.5245283018867924,
"acc_norm_stderr": 0.030735822206205615
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.041553199555931467,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.041553199555931467
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.039505818611799616,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.039505818611799616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819074,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819074
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.45806451612903226,
"acc_stderr": 0.028343787250540618,
"acc_norm": 0.45806451612903226,
"acc_norm_stderr": 0.028343787250540618
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.03445487686264716,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.03445487686264716
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.03481285338232963,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.03481285338232963
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6683937823834197,
"acc_stderr": 0.03397636541089118,
"acc_norm": 0.6683937823834197,
"acc_norm_stderr": 0.03397636541089118
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4641025641025641,
"acc_stderr": 0.025285585990017848,
"acc_norm": 0.4641025641025641,
"acc_norm_stderr": 0.025285585990017848
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.46218487394957986,
"acc_stderr": 0.032385469487589795,
"acc_norm": 0.46218487394957986,
"acc_norm_stderr": 0.032385469487589795
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5614678899082569,
"acc_stderr": 0.02127471307395458,
"acc_norm": 0.5614678899082569,
"acc_norm_stderr": 0.02127471307395458
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.03203614084670058,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.03203614084670058
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.033540924375915195,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.033540924375915195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5569620253164557,
"acc_stderr": 0.03233532777533484,
"acc_norm": 0.5569620253164557,
"acc_norm_stderr": 0.03233532777533484
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5515695067264574,
"acc_stderr": 0.03337883736255099,
"acc_norm": 0.5515695067264574,
"acc_norm_stderr": 0.03337883736255099
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884122,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884122
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5276073619631901,
"acc_stderr": 0.0392237829061099,
"acc_norm": 0.5276073619631901,
"acc_norm_stderr": 0.0392237829061099
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6752136752136753,
"acc_stderr": 0.03067902276549883,
"acc_norm": 0.6752136752136753,
"acc_norm_stderr": 0.03067902276549883
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488585,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488585
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.669220945083014,
"acc_stderr": 0.016824818462563746,
"acc_norm": 0.669220945083014,
"acc_norm_stderr": 0.016824818462563746
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.026788811931562757,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.026788811931562757
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27039106145251396,
"acc_stderr": 0.014854993938010083,
"acc_norm": 0.27039106145251396,
"acc_norm_stderr": 0.014854993938010083
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.028555827516528784,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.028555827516528784
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5530546623794212,
"acc_stderr": 0.028237769422085328,
"acc_norm": 0.5530546623794212,
"acc_norm_stderr": 0.028237769422085328
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5339506172839507,
"acc_stderr": 0.02775653525734767,
"acc_norm": 0.5339506172839507,
"acc_norm_stderr": 0.02775653525734767
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.38652482269503546,
"acc_stderr": 0.02904919034254347,
"acc_norm": 0.38652482269503546,
"acc_norm_stderr": 0.02904919034254347
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2835723598435463,
"acc_stderr": 0.011511900775968328,
"acc_norm": 0.2835723598435463,
"acc_norm_stderr": 0.011511900775968328
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.375,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.375,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.020219083895133924,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.020219083895133924
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5224489795918368,
"acc_stderr": 0.03197694118713672,
"acc_norm": 0.5224489795918368,
"acc_norm_stderr": 0.03197694118713672
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.527363184079602,
"acc_stderr": 0.035302355173346824,
"acc_norm": 0.527363184079602,
"acc_norm_stderr": 0.035302355173346824
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.672514619883041,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.672514619883041,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023495,
"mc2": 0.3349241921724532,
"mc2_stderr": 0.013441943397542705
},
"harness|winogrande|5": {
"acc": 0.7119179163378059,
"acc_stderr": 0.012727884724248115
},
"harness|gsm8k|5": {
"acc": 0.06595905989385899,
"acc_stderr": 0.006836951192034228
}
}
```
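As a sketch (not part of the official leaderboard tooling), the per-task accuracies can be pulled out of a results dict shaped like the JSON above with plain dict handling; tasks such as `truthfulqa:mc` that report other metrics are skipped automatically:

```python
import json

def task_accuracies(results: dict) -> dict:
    """Collect the `acc` metric for every task in a results dict shaped
    like the JSON above (task name -> metrics mapping). Entries without
    an `acc` key (e.g. truthfulqa's mc1/mc2) are skipped."""
    return {
        task: metrics["acc"]
        for task, metrics in results.items()
        if isinstance(metrics, dict) and "acc" in metrics
    }

# Minimal example mirroring the structure of the results above.
sample = {
    "harness|winogrande|5": {"acc": 0.7119179163378059, "acc_stderr": 0.0127},
    "harness|gsm8k|5": {"acc": 0.06595905989385899, "acc_stderr": 0.0068},
    "harness|truthfulqa:mc|0": {"mc1": 0.2252141982864137, "mc2": 0.3349241921724532},
}
accs = task_accuracies(sample)
mean_acc = sum(accs.values()) / len(accs)
print(json.dumps(accs, indent=2))
```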
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
quocanh34/soict_train_synthesis_dataset | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: sentence_norm
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 4941296103
num_examples: 9807
download_size: 1167222835
dataset_size: 4941296103
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "soict_train_synthesis_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/20_Hours_Sichuan_Dialect_Speech_Synthesis_Corpus_Female | ---
license: cc-by-nc-nd-4.0
---
## Description
20 Hours - Sichuan Dialect Speech Synthesis Corpus - Female. It is recorded by a female speaker with Chengdu Sichuan pronunciation, and the phonemes and tones are balanced. A professional phonetician participated in the annotation. It precisely matches the research and development needs of speech synthesis.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1327?source=Huggingface
## Format
48,000 Hz, 24-bit, uncompressed WAV, mono channel;
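A quick way to verify that a delivered file matches this format is to read its header with Python's standard `wave` module; the `matches_spec` helper below is an illustrative sketch, not part of the corpus tooling:

```python
import os
import tempfile
import wave

def matches_spec(path: str) -> bool:
    """Check that a WAV file matches the corpus format described above:
    48,000 Hz sample rate, 24-bit depth (3-byte samples), mono."""
    with wave.open(path, "rb") as wf:
        return (
            wf.getframerate() == 48_000
            and wf.getsampwidth() == 3
            and wf.getnchannels() == 1
        )

# Demo: synthesize a short conforming file (10 ms of silence) and verify it.
demo = os.path.join(tempfile.mkdtemp(), "demo.wav")
with wave.open(demo, "wb") as wf:
    wf.setnchannels(1)
    wf.setsampwidth(3)
    wf.setframerate(48_000)
    wf.writeframes(b"\x00\x00\x00" * 480)
ok = matches_spec(demo)
print(ok)
```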
## Recording environment
professional recording studio;
## Recording content
general corpus
## Speaker
professional character voice, female, 20-30 years old
## Device
microphone;
## Language
Sichuan dialect
## Annotation
word and phoneme transcription, prosodic boundary annotation;
## Application scenarios
speech synthesis.
# Licensing Information
Commercial License
|
CyberHarem/akane_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of akane/室笠アカネ/朱音 (Blue Archive)
This is the dataset of akane/室笠アカネ/朱音 (Blue Archive), containing 500 images and their tags.
The core tags of this character are `long_hair, breasts, large_breasts, halo, glasses, hair_between_eyes, brown_eyes, light_brown_hair, animal_ears, bow, fake_animal_ears, rabbit_ears, black-framed_eyewear, blue_bow, brown_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 892.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akane_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 751.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akane_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1262 | 1.48 GiB | [Download](https://huggingface.co/datasets/CyberHarem/akane_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/akane_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discovered here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 28 |  |  |  |  |  | bare_shoulders, hair_over_one_eye, official_alternate_costume, playboy_bunny, strapless_leotard, blue_leotard, detached_collar, looking_at_viewer, white_gloves, cleavage, blue_bowtie, mole_on_breast, blue_eyes, blush, traditional_bowtie, grin, hair_ribbon, thighband_pantyhose, very_long_hair, 1girl, asymmetrical_bangs, black_pantyhose, highleg_leotard, sideboob, solo, 2girls, blue_ribbon |
| 1 | 11 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, detached_collar, looking_at_viewer, official_alternate_costume, playboy_bunny, simple_background, solo, strapless_leotard, white_gloves, white_leotard, white_pantyhose, aqua_bowtie, cowboy_shot, white_background, smile, parted_lips, shawl, very_long_hair, blush, scarf, blue_bowtie |
| 2 | 8 |  |  |  |  |  | 1boy, 1girl, blush, hetero, official_alternate_costume, paizuri, playboy_bunny, smile, solo_focus, white_gloves, bowtie, detached_collar, looking_at_viewer, nipples, penis, breasts_squeezed_together, mosaic_censoring, pov, sweat, closed_mouth, cum_on_breasts, white_leotard |
| 3 | 17 |  |  |  |  |  | 1girl, blush, official_alternate_costume, playboy_bunny, 1boy, detached_collar, hetero, solo_focus, torn_pantyhose, white_leotard, white_pantyhose, leotard_aside, nipples, strapless_leotard, penis, vaginal, spread_legs, white_gloves, sweat, looking_at_viewer, open_mouth, aqua_bowtie, clothed_sex, bar_censor, girl_on_top, straddling, breasts_out, cum_in_pussy, huge_breasts, lying |
| 4 | 6 |  |  |  |  |  | maid_headdress, white_gloves, black_dress, blue_eyes, frills, looking_at_viewer, puffy_short_sleeves, ribbon, very_long_hair, white_thighhighs, 1girl, blush, bowtie, cleavage, elbow_gloves, grin, white_apron, asymmetrical_bangs, garter_straps, hair_over_one_eye, holding, maid_apron, mole_on_breast, multiple_girls, solo |
| 5 | 22 |  |  |  |  |  | long_sleeves, maid_headdress, 1girl, solo, white_apron, white_gloves, looking_at_viewer, smile, blue_necktie, holding, shawl, blush, closed_mouth, frilled_apron, maid_apron, simple_background, white_background |
| 6 | 18 |  |  |  |  |  | smile, collared_shirt, looking_at_viewer, school_uniform, white_shirt, pleated_skirt, blue_bowtie, blush, 1girl, black_skirt, simple_background, solo, long_sleeves, very_long_hair, black_footwear, black_socks, asymmetrical_bangs, blue_eyes, hair_over_one_eye, hair_ribbon, kneehighs, shoes, white_background, 2girls, full_body, standing |
| 7 | 14 |  |  |  |  |  | looking_at_viewer, alternate_costume, blush, cleavage, navel, stomach, solo_focus, grin, sideboob, white_bikini, bare_shoulders, blue_eyes, hair_over_one_eye, very_long_hair, underboob, 2girls, asymmetrical_bangs, collarbone, dark_skin, hair_ribbon, mole_on_breast, outdoors, sky |
| 8 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, navel, nipples, smile, solo, completely_nude, pussy, blush, collarbone, sweat, female_pubic_hair, huge_breasts, lying, mosaic_censoring, spread_legs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | bare_shoulders | hair_over_one_eye | official_alternate_costume | playboy_bunny | strapless_leotard | blue_leotard | detached_collar | looking_at_viewer | white_gloves | cleavage | blue_bowtie | mole_on_breast | blue_eyes | blush | traditional_bowtie | grin | hair_ribbon | thighband_pantyhose | very_long_hair | 1girl | asymmetrical_bangs | black_pantyhose | highleg_leotard | sideboob | solo | 2girls | blue_ribbon | simple_background | white_leotard | white_pantyhose | aqua_bowtie | cowboy_shot | white_background | smile | parted_lips | shawl | scarf | 1boy | hetero | paizuri | solo_focus | bowtie | nipples | penis | breasts_squeezed_together | mosaic_censoring | pov | sweat | closed_mouth | cum_on_breasts | torn_pantyhose | leotard_aside | vaginal | spread_legs | open_mouth | clothed_sex | bar_censor | girl_on_top | straddling | breasts_out | cum_in_pussy | huge_breasts | lying | maid_headdress | black_dress | frills | puffy_short_sleeves | ribbon | white_thighhighs | elbow_gloves | white_apron | garter_straps | holding | maid_apron | multiple_girls | long_sleeves | blue_necktie | frilled_apron | collared_shirt | school_uniform | white_shirt | pleated_skirt | black_skirt | black_footwear | black_socks | kneehighs | shoes | full_body | standing | alternate_costume | navel | stomach | white_bikini | underboob | collarbone | dark_skin | outdoors | sky | completely_nude | pussy | female_pubic_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------|:--------------------|:-----------------------------|:----------------|:--------------------|:---------------|:------------------|:--------------------|:---------------|:-----------|:--------------|:-----------------|:------------|:--------|:---------------------|:-------|:--------------|:----------------------|:-----------------|:--------|:---------------------|:------------------|:------------------|:-----------|:-------|:---------|:--------------|:--------------------|:----------------|:------------------|:--------------|:--------------|:-------------------|:--------|:--------------|:--------|:--------|:-------|:---------|:----------|:-------------|:---------|:----------|:--------|:----------------------------|:-------------------|:------|:--------|:---------------|:-----------------|:-----------------|:----------------|:----------|:--------------|:-------------|:--------------|:-------------|:--------------|:-------------|:--------------|:---------------|:---------------|:--------|:-----------------|:--------------|:---------|:----------------------|:---------|:-------------------|:---------------|:--------------|:----------------|:----------|:-------------|:-----------------|:---------------|:---------------|:----------------|:-----------------|:-----------------|:--------------|:----------------|:--------------|:-----------------|:--------------|:------------|:--------|:------------|:-----------|:--------------------|:--------|:----------|:---------------|:------------|:-------------|:------------|:-----------|:------|:------------------|:--------|:--------------------|
| 0 | 28 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | | X | X | X | | X | X | X | X | X | | | X | | | | | X | X | | | | | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | | | X | X | | | X | X | X | | | | | X | | | | | | X | | | | | | | | | X | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 17 |  |  |  |  |  | | | X | X | X | | X | X | X | | | | | X | | | | | | X | | | | | | | | | X | X | X | | | | | | | X | X | | X | | X | X | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | | X | | | | | | X | X | X | | X | X | X | | X | | | X | X | X | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 22 |  |  |  |  |  | | | | | | | | X | X | | | | | X | | | | | | X | | | | | X | | | X | | | | | X | X | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | X | | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 18 |  |  |  |  |  | | X | | | | | | X | | | X | | X | X | | | X | | X | X | X | | | | X | X | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 7 | 14 |  |  |  |  |  | X | X | | | | | | X | | X | | X | X | X | | X | X | | X | | X | | | X | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | |
| 8 | 5 |  |  |  |  |  | | | | | | | | X | | | | | | X | | | | | | X | | | | | X | | | | | | | | | X | | | | | | | | | X | | | X | | X | | | | | | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | X | X | X |
|
tappyness1/one_dash | ---
dataset_info:
features:
- name: Chapter
dtype: float64
- name: Character
dtype: string
- name: Appearance
dtype: string
- name: Appearance Notes
dtype: string
- name: Arc
dtype: string
splits:
- name: train
num_bytes: 1301962
num_examples: 23024
download_size: 202847
dataset_size: 1301962
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "one_dash"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Phantom-Artist/phantom-diffusion-s3-the-last-8-dataset | ---
license: cc0-1.0
---
Images used to train my [phantom diffusion s3 the last 8](https://huggingface.co/Phantom-Artist/phantom-diffusion-s3-the-last-8) series.
Since they are all AI-generated images that are in the public domain under US law, I claim it is legal to redistribute them as public domain.
However, they might be under copyright in your/their country of origin.
Still, many countries, including Japan, allow their use for AI training under copyright law, and because all the artists here are from Japan, I assume it should be permissible to reuse them for training globally.
|
BangumiBase/accelworld | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Accel World
This is the image base of bangumi Accel World, we detected 34 characters, 2098 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
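If you want to script that cleanup, a minimal standard-library sketch is shown below; the folder-per-character layout inside `all.zip` is an assumption based on the per-character `dataset.zip` links:

```python
import collections
import os
import tempfile
import zipfile

def extract_and_count(zip_path: str, out_dir: str) -> dict:
    """Extract a bangumi archive (e.g. `all.zip`) and count images per
    top-level character folder, so outliers can be inspected by hand."""
    counts = collections.Counter()
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(out_dir)
        for name in zf.namelist():
            if name.lower().endswith((".png", ".jpg", ".jpeg", ".webp")):
                counts[name.split("/")[0]] += 1
    return dict(counts)

# Demo on a tiny synthetic archive with two character folders.
tmp = tempfile.mkdtemp()
demo_zip = os.path.join(tmp, "all.zip")
with zipfile.ZipFile(demo_zip, "w") as zf:
    zf.writestr("0/a.png", b"fake")
    zf.writestr("0/b.jpg", b"fake")
    zf.writestr("1/c.png", b"fake")
counts = extract_and_count(demo_zip, os.path.join(tmp, "out"))
print(counts)
```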
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 146 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 8 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 614 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 140 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 55 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 27 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 8 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 58 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 47 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 16 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 21 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 99 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 13 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 8 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 23 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 10 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 27 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 429 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 14 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 17 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 14 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 28 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 6 | [Download](22/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 23 | 10 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 13 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 14 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 20 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 9 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 6 | [Download](28/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 29 | 5 | [Download](29/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 30 | 5 | [Download](30/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 31 | 10 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 7 | [Download](32/dataset.zip) |  |  |  |  |  |  |  | N/A |
| noise | 171 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
manu/mmlu_auxiliary_train_formatted_extra | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 168304845
num_examples: 19969
download_size: 99013383
dataset_size: 168304845
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
furry-br/fatty-cow | ---
license: openrail
---
|
mstz/segment | ---
license: cc-by-4.0
---
|
CyberHarem/zenobia_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of zenobia/ゼノビア/芝诺比阿 (Fate/Grand Order)
This is the dataset of zenobia/ゼノビア/芝诺比阿 (Fate/Grand Order), containing 261 images and their tags.
The core tags of this character are `long_hair, grey_hair, breasts, dark_skin, dark-skinned_female, blue_eyes, large_breasts, crown, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 261 | 463.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zenobia_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 261 | 391.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zenobia_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 651 | 751.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zenobia_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/zenobia_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discovered here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, abs, bare_shoulders, black_bikini, black_gloves, black_thighhighs, chain, cleavage, cuffs, elbow_gloves, highleg_bikini, looking_at_viewer, navel, neck_ring, solo, thighs, spear |
| 1 | 8 |  |  |  |  |  | 1girl, bare_shoulders, black_bikini, black_gloves, black_thighhighs, cleavage, cuffs, elbow_gloves, highleg_bikini, looking_at_viewer, neck_ring, solo, thighs, navel, abs, gold_chain |
| 2 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_bikini, black_gloves, black_thighhighs, blush, cleavage, cuffs, elbow_gloves, highleg_bikini, looking_at_viewer, navel, neck_ring, solo, stomach, gold_chain, skindentation, ass_visible_through_thighs |
| 3 | 19 |  |  |  |  |  | 1girl, ass, bare_shoulders, black_bikini, black_gloves, chain, elbow_gloves, highleg_bikini, looking_at_viewer, solo, thighs, cuffs, looking_back, neck_ring, thong_bikini, black_thighhighs |
| 4 | 7 |  |  |  |  |  | 1girl, bare_shoulders, black_bikini, cleavage, elbow_gloves, looking_at_viewer, solo, black_gloves, neck_ring, gold_chain |
| 5 | 8 |  |  |  |  |  | 1boy, 1girl, bare_shoulders, black_bikini, cleavage, hetero, solo_focus, blush, looking_at_viewer, neck_ring, paizuri, penis, sweat, elbow_gloves, pov, black_gloves, censored, closed_mouth, gold_chain, huge_breasts, male_pubic_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | abs | bare_shoulders | black_bikini | black_gloves | black_thighhighs | chain | cleavage | cuffs | elbow_gloves | highleg_bikini | looking_at_viewer | navel | neck_ring | solo | thighs | spear | gold_chain | blush | stomach | skindentation | ass_visible_through_thighs | ass | looking_back | thong_bikini | 1boy | hetero | solo_focus | paizuri | penis | sweat | pov | censored | closed_mouth | huge_breasts | male_pubic_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------|:-----------------|:---------------|:---------------|:-------------------|:--------|:-----------|:--------|:---------------|:-----------------|:--------------------|:--------|:------------|:-------|:---------|:--------|:-------------|:--------|:----------|:----------------|:-----------------------------|:------|:---------------|:---------------|:-------|:---------|:-------------|:----------|:--------|:--------|:------|:-----------|:---------------|:---------------|:------------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | | X | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | X | X | X | | X | X | X | X | X | X | X | X | | | X | X | X | X | X | | | | | | | | | | | | | | |
| 3 | 19 |  |  |  |  |  | X | | X | X | X | X | X | | X | X | X | X | | X | X | X | | | | | | | X | X | X | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | X | X | X | | | X | | X | | X | | X | X | | | X | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | X | X | X | | | X | | X | | X | | X | | | | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
ZurabDz/tokenized_large_corpus | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 5035621680
num_examples: 5521515
download_size: 1745716024
dataset_size: 5035621680
---
# Dataset Card for "tokenized_large_corpus"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibm-nl2ui-cv/puext690refExp_simp | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: image_id
dtype: string
- name: prompt
dtype: string
- name: target_bounding_box
struct:
- name: xmax
dtype: string
- name: xmin
dtype: string
- name: ymax
dtype: string
- name: ymin
dtype: string
- name: file_name
dtype: string
splits:
- name: train
num_bytes: 4261937860.33
num_examples: 26033
- name: test
num_bytes: 578755414.046
num_examples: 2893
download_size: 2524714386
dataset_size: 4840693274.375999
---
# Dataset Card for "puext690refExp_simp"
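Note that the `target_bounding_box` coordinates are stored as strings in this dataset's schema. A minimal sketch for converting one record's box to integer pixel coordinates (field names follow the schema above; the helper name is illustrative):

```python
def parse_bbox(box):
    """Convert a string-valued bounding box dict to integer pixel coords.

    Expects the layout declared in this dataset's schema:
    {'xmin': '10', 'ymin': '20', 'xmax': '110', 'ymax': '220'}
    Returns (xmin, ymin, xmax, ymax) as ints.
    """
    # float() first tolerates values like '20.0' before truncating to int
    return tuple(int(float(box[k])) for k in ('xmin', 'ymin', 'xmax', 'ymax'))
```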
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kaleemWaheed/twitter_dataset_1713018249 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 9471
num_examples: 23
download_size: 9176
dataset_size: 9471
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
weikaih/dynabench-source | ---
license: apache-2.0
---
|
stargaret/noir | ---
license: artistic-2.0
---
|
CyberHarem/repulse_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of repulse/レパルス/反击 (Azur Lane)
This is the dataset of repulse/レパルス/反击 (Azur Lane), containing 81 images and their tags.
The core tags of this character are `blue_eyes, brown_hair, breasts, bangs, short_hair, braid, hair_between_eyes, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 81 | 78.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/repulse_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 81 | 51.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/repulse_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 175 | 100.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/repulse_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 81 | 72.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/repulse_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 175 | 129.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/repulse_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. To use it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/repulse_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
The list of tag clustering results; distinct outfits may be discoverable in these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, bare_shoulders, blue_skirt, blush, choker, halterneck, midriff, navel, solo, turret, white_thighhighs, anchor, capelet, collarbone, detached_sleeves, french_braid, long_sleeves, looking_at_viewer, machinery, miniskirt, one_eye_closed, pleated_skirt, short_ponytail, simple_background, smokestack, white_background, white_gloves, zettai_ryouiki, crop_top, full_body, grin, rigging, sidelocks, sword, ;d, cannon, chain, leg_up, one_side_up, outstretched_arm, pointing, standing_on_one_leg, stomach, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | blue_skirt | blush | choker | halterneck | midriff | navel | solo | turret | white_thighhighs | anchor | capelet | collarbone | detached_sleeves | french_braid | long_sleeves | looking_at_viewer | machinery | miniskirt | one_eye_closed | pleated_skirt | short_ponytail | simple_background | smokestack | white_background | white_gloves | zettai_ryouiki | crop_top | full_body | grin | rigging | sidelocks | sword | ;d | cannon | chain | leg_up | one_side_up | outstretched_arm | pointing | standing_on_one_leg | stomach | white_shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------------|:--------|:---------|:-------------|:----------|:--------|:-------|:---------|:-------------------|:---------|:----------|:-------------|:-------------------|:---------------|:---------------|:--------------------|:------------|:------------|:-----------------|:----------------|:-----------------|:--------------------|:-------------|:-------------------|:---------------|:-----------------|:-----------|:------------|:-------|:----------|:------------|:--------|:-----|:---------|:--------|:---------|:--------------|:-------------------|:-----------|:----------------------|:----------|:--------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
result-kand2-sdxl-wuerst-karlo/8edb1fe9 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 258
num_examples: 10
download_size: 1429
dataset_size: 258
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "8edb1fe9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
clarin-knext/dbpedia-pl-qrels | ---
language:
- pl
---
Part of **BEIR-PL: Zero Shot Information Retrieval Benchmark for the Polish Language**.
Link to the arXiv paper: https://arxiv.org/pdf/2305.19840.pdf
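For retrieval evaluation, qrels rows are typically grouped into a nested `{query_id: {doc_id: relevance}}` mapping. A minimal sketch, assuming the standard BEIR qrels schema (`query-id`, `corpus-id`, `score` columns); the helper name is illustrative:

```python
def build_qrels(rows):
    """Group flat qrels rows into {query_id: {doc_id: relevance}}.

    Each row is expected to carry 'query-id', 'corpus-id' and 'score'
    keys, as in the standard BEIR qrels layout.
    """
    qrels = {}
    for row in rows:
        qrels.setdefault(row['query-id'], {})[row['corpus-id']] = int(row['score'])
    return qrels
```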
Contact: konrad.wojtasik@pwr.edu.pl |
TigerResearch/tigerbot-cmu-wiki-en | ---
license: apache-2.0
language:
- en
---
[Tigerbot](https://github.com/TigerResearch/TigerBot) SFT data compiled from the open Wiki QA dataset released by CMU.

Original source: [http://www.cs.cmu.edu/~ark/QA-data/](http://www.cs.cmu.edu/~ark/QA-data/)
## Usage
```python
import datasets
ds_sft = datasets.load_dataset('TigerResearch/tigerbot-cmu-wiki-en')
``` |
arieg/bw_spec_cls_4_11_noise_200 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '897'
'1': '995'
'2': '997'
'3': '998'
splits:
- name: train
num_bytes: 35205161.0
num_examples: 800
- name: test
num_bytes: 880587.0
num_examples: 20
download_size: 17615671
dataset_size: 36085748.0
---
# Dataset Card for "bw_spec_cls_4_11_noise_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
boseong/dataset_llama_bs2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 22065
num_examples: 63
download_size: 10612
dataset_size: 22065
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pranjali97/covid-qa | ---
dataset_info:
features:
- name: context
dtype: string
- name: document_id
dtype: int64
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 48664845
num_examples: 1417
- name: validation
num_bytes: 4316222
num_examples: 203
- name: test
num_bytes: 11611421
num_examples: 375
download_size: 0
dataset_size: 64592488
---
# Dataset Card for "covid-qa"
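The `answers` field follows the SQuAD-style layout declared above: parallel `answer_start` and `text` sequences indexed into `context`. A minimal sketch for recovering the answer spans from one example (field names per the schema; the helper name is illustrative):

```python
def extract_answer_spans(example):
    """Return the answer substrings sliced out of the context.

    `example` follows this dataset's schema: a 'context' string plus an
    'answers' struct with parallel 'answer_start' and 'text' sequences.
    """
    ctx = example['context']
    answers = example['answers']
    spans = []
    for start, text in zip(answers['answer_start'], answers['text']):
        # each answer occupies ctx[start : start + len(text)]
        spans.append(ctx[start:start + len(text)])
    return spans
```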
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
luigisaetta/atco2_atcosim | ---
dataset_info:
features:
- name: id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 2049253684.428
num_examples: 8142
- name: test
num_bytes: 483912622.003
num_examples: 1957
download_size: 2521597292
dataset_size: 2533166306.4309998
---
# Dataset Card for "atco2_atcosim"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ManuelAlv/Medical_Summaries | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 13456200
num_examples: 10828
- name: validation
num_bytes: 3349088
num_examples: 2707
- name: test
num_bytes: 1136311
num_examples: 903
download_size: 9609390
dataset_size: 17941599
---
# Dataset Card for "Medical_Summaries"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sirgecko/jpvnlearning | ---
license: unlicense
---
|
guoyu-zhang/usp_1 | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 496440
num_examples: 1000
download_size: 272578
dataset_size: 496440
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
avinashmhto/gloria-gpt | ---
license: cc0-1.0
tags:
- chatGPT
---
<p align="center"><h1> Gloria ChatGPT Prompts [CSV dataset]</h1></p>
This is a dataset repository of **Awesome ChatGPT Prompts**.
**[View All Prompts on GitHub](https://github.com/Avinash-Mahto/gloriaGPT.git)**
# License
CC-0 |
mask-distilled-one-sec-cv12/chunk_188 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 765078092
num_examples: 150251
download_size: 773129881
dataset_size: 765078092
---
# Dataset Card for "chunk_188"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MosenA/14_24_Alriyadh | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: title
dtype: string
- name: content
dtype: string
- name: date
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 2165621024
num_examples: 680084
download_size: 1013255278
dataset_size: 2165621024
---
# Dataset Card for "14_24_Alriyadh"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yale-nlp/FOLIO | ---
license: mit
---
|
ejschwartz/oo-method-test | ---
license: bsd
task_categories:
- text-classification
#task_ids:
#- binary-classification
dataset_info:
features:
- name: Binary
dtype: string
- name: Addr
dtype: string
- name: Name
dtype: string
- name: Type
dtype:
class_label:
names:
'0': func
'1': method
- name: Disassembly
dtype: string
config_name: ejschwartz--oo-method-test
splits:
- name: combined
num_bytes: 6054378861
num_examples: 3537794
download_size: 1351783459
dataset_size: 6054378861
train-eval-index:
- config: default # The dataset config name to use. Example for datasets without configs: default. Example for glue: sst2
task: text-classification # The task category name (same as task_category). Example: question-answering
task_id: binary_classification # The AutoTrain task id. Example: extractive_question_answering
splits:
#train_split: train # The split to use for training. Example: train
eval_split: train # The split to use for evaluation. Example: test
col_mapping: # The columns mapping needed to configure the task_id.
Disassembly: text
Type: target
metrics:
- type: accuracy # The metric id. Example: wer. Use metric id from https://hf.co/metrics
name: accuracy # The metric name to be displayed. Example: Test WER
---
# Dataset Card for OO Method Test Dataset
## Dataset Description
### Dataset Summary
This dataset describes compiled functions in various [small, simple C++ programs](https://github.com/sei-eschwartz/buildexes/tree/master/tests/src/oo).
These programs were automatically compiled using various versions of Microsoft's Visual C++ compiler and different compilation settings. The details can be found
in the [BuildExes](https://github.com/sei-eschwartz/buildexes) repository.
For each function, the dataset includes a disassembled (using ROSE's `bat-dis` tool) representation of the compiled code, its name, and whether the function is an OO method or not.
**This dataset is largely intended for @ejschwartz to experiment with learning techniques and tools. The programs are artificial and are likely not representative of real programs.**
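The `Type` feature is a `class_label` with names `func` and `method`, so each example carries an integer label. A minimal sketch of decoding those labels without loading the dataset (the mapping is taken from the YAML header above; the helper names are illustrative):

```python
# index -> name, per the class_label declared in the YAML header
TYPE_NAMES = ['func', 'method']

def decode_type(label_id):
    """Map the integer Type label back to its string name."""
    return TYPE_NAMES[label_id]

def is_method(label_id):
    """True when the labelled function is an OO method."""
    return decode_type(label_id) == 'method'
```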
### Supported Tasks and Leaderboards
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed] |
open-llm-leaderboard/details_N8Programs__Thestral-v0.2 | ---
pretty_name: Evaluation run of N8Programs/Thestral-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [N8Programs/Thestral-v0.2](https://huggingface.co/N8Programs/Thestral-v0.2) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_N8Programs__Thestral-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-27T17:48:42.793949](https://huggingface.co/datasets/open-llm-leaderboard/details_N8Programs__Thestral-v0.2/blob/main/results_2024-03-27T17-48-42.793949.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6259891232287084,\n\
\ \"acc_stderr\": 0.03236901406421152,\n \"acc_norm\": 0.6306542725684062,\n\
\ \"acc_norm_stderr\": 0.033007114298195854,\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5277418922277624,\n\
\ \"mc2_stderr\": 0.015223668675728603\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.575938566552901,\n \"acc_stderr\": 0.014441889627464396,\n\
\ \"acc_norm\": 0.6271331058020477,\n \"acc_norm_stderr\": 0.014131176760131167\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6357299342760406,\n\
\ \"acc_stderr\": 0.004802413919932667,\n \"acc_norm\": 0.8249352718581956,\n\
\ \"acc_norm_stderr\": 0.0037924580005234266\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800897,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800897\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.034765901043041336,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.034765901043041336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.03772446857518026,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.03772446857518026\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894442,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894442\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7967741935483871,\n \"acc_stderr\": 0.02289168798455495,\n \"\
acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.02289168798455495\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.030874145136562076,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.030874145136562076\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.617948717948718,\n \"acc_stderr\": 0.024635549163908237,\n \
\ \"acc_norm\": 0.617948717948718,\n \"acc_norm_stderr\": 0.024635549163908237\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881563,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881563\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513537,\n \
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513537\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.01626567563201035,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.01626567563201035\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243741,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243741\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699796,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.04742762361243011,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.04742762361243011\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381384,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381384\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247323,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.21787709497206703,\n\
\ \"acc_stderr\": 0.013806211780732986,\n \"acc_norm\": 0.21787709497206703,\n\
\ \"acc_norm_stderr\": 0.013806211780732986\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464485,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464485\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4641460234680574,\n\
\ \"acc_stderr\": 0.012737361318730581,\n \"acc_norm\": 0.4641460234680574,\n\
\ \"acc_norm_stderr\": 0.012737361318730581\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406762,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406762\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n\
\ \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.7910447761194029,\n\
\ \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5277418922277624,\n\
\ \"mc2_stderr\": 0.015223668675728603\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7576953433307024,\n \"acc_stderr\": 0.012042352526174792\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.45943896891584535,\n \
\ \"acc_stderr\": 0.013727093010429788\n }\n}\n```"
repo_url: https://huggingface.co/N8Programs/Thestral-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|arc:challenge|25_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|gsm8k|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hellaswag|10_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T17-48-42.793949.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T17-48-42.793949.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- '**/details_harness|winogrande|5_2024-03-27T17-48-42.793949.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-27T17-48-42.793949.parquet'
- config_name: results
data_files:
- split: 2024_03_27T17_48_42.793949
path:
- results_2024-03-27T17-48-42.793949.parquet
- split: latest
path:
- results_2024-03-27T17-48-42.793949.parquet
---
# Dataset Card for Evaluation run of N8Programs/Thestral-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [N8Programs/Thestral-v0.2](https://huggingface.co/N8Programs/Thestral-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_N8Programs__Thestral-v0.2",
"harness_winogrande_5",
split="train")
```
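The timestamped split names and parquet filenames used throughout this card's `configs` section are both derived from the run timestamp recorded in the results file. As an illustrative sketch of that mapping (string literals taken from this card, not an official utility):

```python
# Run timestamp as recorded in the results JSON for this card.
run_timestamp = "2024-03-27T17:48:42.793949"

# Split name: both "-" and ":" become "_".
split_name = run_timestamp.replace("-", "_").replace(":", "_")
# -> "2024_03_27T17_48_42.793949"

# Parquet filename stamp: only ":" becomes "-".
file_stamp = run_timestamp.replace(":", "-")
# -> "2024-03-27T17-48-42.793949"

print(split_name, file_stamp)
```

This matches the split names (e.g. `2024_03_27T17_48_42.793949`) and parquet paths (e.g. `...|5_2024-03-27T17-48-42.793949.parquet`) listed in the YAML above.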
## Latest results
These are the [latest results from run 2024-03-27T17:48:42.793949](https://huggingface.co/datasets/open-llm-leaderboard/details_N8Programs__Thestral-v0.2/blob/main/results_2024-03-27T17-48-42.793949.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task under its own configuration, with a "latest" split for each eval):
```python
{
"all": {
"acc": 0.6259891232287084,
"acc_stderr": 0.03236901406421152,
"acc_norm": 0.6306542725684062,
"acc_norm_stderr": 0.033007114298195854,
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5277418922277624,
"mc2_stderr": 0.015223668675728603
},
"harness|arc:challenge|25": {
"acc": 0.575938566552901,
"acc_stderr": 0.014441889627464396,
"acc_norm": 0.6271331058020477,
"acc_norm_stderr": 0.014131176760131167
},
"harness|hellaswag|10": {
"acc": 0.6357299342760406,
"acc_stderr": 0.004802413919932667,
"acc_norm": 0.8249352718581956,
"acc_norm_stderr": 0.0037924580005234266
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800897,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800897
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.034765901043041336,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.034765901043041336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518026,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518026
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894442,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894442
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.02289168798455495,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.02289168798455495
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.030874145136562076,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.030874145136562076
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.617948717948718,
"acc_stderr": 0.024635549163908237,
"acc_norm": 0.617948717948718,
"acc_norm_stderr": 0.024635549163908237
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881563,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881563
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513537,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513537
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.01626567563201035,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.01626567563201035
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.02862654791243741,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.02862654791243741
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699796,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.04742762361243011,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.04742762361243011
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381384,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381384
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247323,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.21787709497206703,
"acc_stderr": 0.013806211780732986,
"acc_norm": 0.21787709497206703,
"acc_norm_stderr": 0.013806211780732986
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.02505850331695814,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.02505850331695814
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464485,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464485
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4641460234680574,
"acc_stderr": 0.012737361318730581,
"acc_norm": 0.4641460234680574,
"acc_norm_stderr": 0.012737361318730581
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406762,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406762
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7910447761194029,
"acc_stderr": 0.028748298931728655,
"acc_norm": 0.7910447761194029,
"acc_norm_stderr": 0.028748298931728655
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5277418922277624,
"mc2_stderr": 0.015223668675728603
},
"harness|winogrande|5": {
"acc": 0.7576953433307024,
"acc_stderr": 0.012042352526174792
},
"harness|gsm8k|5": {
"acc": 0.45943896891584535,
"acc_stderr": 0.013727093010429788
}
}
```
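The per-task dict above can be aggregated locally. As a hedged sketch (the leaderboard's official aggregation is computed server-side and may differ in detail), an unweighted mean of `acc` over the `hendrycksTest` subtasks looks like this:

```python
import statistics

# A few entries copied from the results above; in practice you would load the
# full results_*.json file instead of hard-coding values.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5421686746987951},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8362573099415205},
    "harness|winogrande|5": {"acc": 0.7576953433307024},
}

# Keep only the MMLU (hendrycksTest) subtasks, then average them equally.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
print(statistics.mean(mmlu_accs))
```

The key-prefix filter mirrors the `harness|hendrycksTest-<subject>|5` naming convention used in the results dict.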
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
dhurley/medicare | ---
license: mit
---
|
gargsahil713repo/hisar_weather | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_yeen214__test_llama2_7b | ---
pretty_name: Evaluation run of yeen214/test_llama2_7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeen214/test_llama2_7b](https://huggingface.co/yeen214/test_llama2_7b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeen214__test_llama2_7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T17:29:42.839571](https://huggingface.co/datasets/open-llm-leaderboard/details_yeen214__test_llama2_7b/blob/main/results_2023-10-24T17-29-42.839571.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n\
\ \"em_stderr\": 0.0003778609196461104,\n \"f1\": 0.05606543624161075,\n\
\ \"f1_stderr\": 0.0013211107078874738,\n \"acc\": 0.4057988012013119,\n\
\ \"acc_stderr\": 0.00970458141675358\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196461104,\n\
\ \"f1\": 0.05606543624161075,\n \"f1_stderr\": 0.0013211107078874738\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0712661106899166,\n \
\ \"acc_stderr\": 0.007086462127954491\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552667\n\
\ }\n}\n```"
repo_url: https://huggingface.co/yeen214/test_llama2_7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|arc:challenge|25_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T17_29_42.839571
path:
- '**/details_harness|drop|3_2023-10-24T17-29-42.839571.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T17-29-42.839571.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T17_29_42.839571
path:
- '**/details_harness|gsm8k|5_2023-10-24T17-29-42.839571.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T17-29-42.839571.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hellaswag|10_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T17_29_42.839571
path:
- '**/details_harness|winogrande|5_2023-10-24T17-29-42.839571.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T17-29-42.839571.parquet'
- config_name: results
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- results_2023-10-04T02-28-22.719592.parquet
- split: 2023_10_24T17_29_42.839571
path:
- results_2023-10-24T17-29-42.839571.parquet
- split: latest
path:
- results_2023-10-24T17-29-42.839571.parquet
---
# Dataset Card for Evaluation run of yeen214/test_llama2_7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeen214/test_llama2_7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeen214/test_llama2_7b](https://huggingface.co/yeen214/test_llama2_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
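As a rough illustration of the naming scheme (inferred from the config listing above, where a results file like `results_2023-10-24T17-29-42.839571.parquet` corresponds to the split `2023_10_24T17_29_42.839571`), the split name for a run can be derived from the results filename:

```python
def split_name_from_results_file(filename: str) -> str:
    """Convert a results filename such as
    'results_2023-10-24T17-29-42.839571.parquet' into the split name
    '2023_10_24T17_29_42.839571' used in this dataset's configurations.
    The scheme is inferred from the file/split listing above."""
    stem = filename.removeprefix("results_").removesuffix(".parquet")
    date, time = stem.split("T")  # e.g. '2023-10-24', '17-29-42.839571'
    return date.replace("-", "_") + "T" + time.replace("-", "_")

print(split_name_from_results_file("results_2023-10-24T17-29-42.839571.parquet"))
# 2023_10_24T17_29_42.839571
```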
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeen214__test_llama2_7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T17:29:42.839571](https://huggingface.co/datasets/open-llm-leaderboard/details_yeen214__test_llama2_7b/blob/main/results_2023-10-24T17-29-42.839571.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196461104,
"f1": 0.05606543624161075,
"f1_stderr": 0.0013211107078874738,
"acc": 0.4057988012013119,
"acc_stderr": 0.00970458141675358
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196461104,
"f1": 0.05606543624161075,
"f1_stderr": 0.0013211107078874738
},
"harness|gsm8k|5": {
"acc": 0.0712661106899166,
"acc_stderr": 0.007086462127954491
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552667
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
NgThVinh/ValorantAgentVoiceLines | ---
dataset_info:
- config_name: astra
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 80820084.0
num_examples: 423
download_size: 0
dataset_size: 80820084.0
- config_name: breach
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 51605387.0
num_examples: 382
download_size: 0
dataset_size: 51605387.0
- config_name: brimstone
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 55140726.0
num_examples: 386
download_size: 0
dataset_size: 55140726.0
- config_name: chamber
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 59969548.0
num_examples: 351
download_size: 0
dataset_size: 59969548.0
- config_name: cypher
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 73672174.0
num_examples: 404
download_size: 69561478
dataset_size: 73672174.0
- config_name: deadlock
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 96796100.0
num_examples: 354
download_size: 84548642
dataset_size: 96796100.0
- config_name: fade
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 57426915.0
num_examples: 361
download_size: 52041862
dataset_size: 57426915.0
- config_name: gekko
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 92006966.0
num_examples: 402
download_size: 81440562
dataset_size: 92006966.0
- config_name: harbor
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 56668327.0
num_examples: 349
download_size: 54129833
dataset_size: 56668327.0
- config_name: jett
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 55791293.0
num_examples: 396
download_size: 52808521
dataset_size: 55791293.0
- config_name: kayo
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 54347793.0
num_examples: 388
download_size: 52461214
dataset_size: 54347793.0
- config_name: killjoy
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 76301591.0
num_examples: 413
download_size: 73500082
dataset_size: 76301591.0
- config_name: neon
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 48667249.0
num_examples: 379
download_size: 44390392
dataset_size: 48667249.0
- config_name: omen
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 77842248.0
num_examples: 398
download_size: 73663116
dataset_size: 77842248.0
- config_name: phoenix
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 55511767.0
num_examples: 379
download_size: 52647238
dataset_size: 55511767.0
- config_name: raze
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 70648544.0
num_examples: 418
download_size: 67349655
dataset_size: 70648544.0
- config_name: reyna
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 108953635.0
num_examples: 681
download_size: 102575408
dataset_size: 108953635.0
- config_name: sage
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 46907688.0
num_examples: 352
download_size: 45251868
dataset_size: 46907688.0
- config_name: skye
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 69454834.0
num_examples: 384
download_size: 66348392
dataset_size: 69454834.0
- config_name: sova
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 54911309.0
num_examples: 402
download_size: 52369693
dataset_size: 54911309.0
- config_name: viper
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 74295166.0
num_examples: 410
download_size: 68581546
dataset_size: 74295166.0
- config_name: yoru
features:
- name: audio_name
dtype: string
- name: audio_file
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 57015286.0
num_examples: 395
download_size: 54678076
dataset_size: 57015286.0
configs:
- config_name: astra
data_files:
- split: train
path: astra/train-*
- config_name: breach
data_files:
- split: train
path: breach/train-*
- config_name: brimstone
data_files:
- split: train
path: brimstone/train-*
- config_name: chamber
data_files:
- split: train
path: chamber/train-*
- config_name: cypher
data_files:
- split: train
path: cypher/train-*
- config_name: deadlock
data_files:
- split: train
path: deadlock/train-*
- config_name: fade
data_files:
- split: train
path: fade/train-*
- config_name: gekko
data_files:
- split: train
path: gekko/train-*
- config_name: harbor
data_files:
- split: train
path: harbor/train-*
- config_name: jett
data_files:
- split: train
path: jett/train-*
- config_name: kayo
data_files:
- split: train
path: kayo/train-*
- config_name: killjoy
data_files:
- split: train
path: killjoy/train-*
- config_name: neon
data_files:
- split: train
path: neon/train-*
- config_name: omen
data_files:
- split: train
path: omen/train-*
- config_name: phoenix
data_files:
- split: train
path: phoenix/train-*
- config_name: raze
data_files:
- split: train
path: raze/train-*
- config_name: reyna
data_files:
- split: train
path: reyna/train-*
- config_name: sage
data_files:
- split: train
path: sage/train-*
- config_name: skye
data_files:
- split: train
path: skye/train-*
- config_name: sova
data_files:
- split: train
path: sova/train-*
- config_name: viper
data_files:
- split: train
path: viper/train-*
- config_name: yoru
data_files:
- split: train
path: yoru/train-*
---
# Dataset Card for "ValorantAgentVoiceLines"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-combined | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 1633139896
num_examples: 122461
download_size: 123866375
dataset_size: 1633139896
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Mxode/StackOverflow-QA-C-Language-5k | ---
license: apache-2.0
language:
- en
tags:
- code
task_categories:
- question-answering
size_categories:
- 1K<n<10K
---
PS: More data (40k) can be found here [Mxode/StackOverflow-QA-C-Language-40k](https://huggingface.co/datasets/Mxode/StackOverflow-QA-C-Language-40k).
---
This is a collection of ~5,000 Q&A pairs on the **C language** from StackOverflow. The data has been initially cleaned, and each response is the question's **Accepted Answer**.
All entries are **<500** characters in length.
The questions and answers were organized into a **one-line** format. A sample format is shown below:
```json
{
"question": "```\nFILE* file = fopen(some file)\n\npcap_t* pd = pcap_fopen_offline(file)\n\npcap_close(pd)\n\nfclose(file)\n```\n\nThis code occurs double free error.\n\nCould you explain about this happening?\n\nMy Guess is that pd and file pointers are sharing some datas.\n",
"answer": "As the documentation says, thepcap_closefunction closes the files associated with thepcap_tstructure passed to it. Closing the file again withfcloseis an error.\n"
}
```
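Since every record is a single JSON object in the one-line format described above, a minimal sketch for parsing an entry with the standard library (the record contents here are illustrative, not taken from the dataset) could be:

```python
import json

# One record in the one-line format described above (contents are illustrative).
line = '{"question": "Why does fclose after pcap_close double free?", "answer": "pcap_close already closes the FILE*."}'

record = json.loads(line)

# Each entry carries exactly these two fields.
assert set(record) == {"question", "answer"}
print(record["question"])
```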
|
RocioUrquijo/en_de_TR | ---
dataset_info:
features:
- name: EN
dtype: string
- name: DE
dtype: string
splits:
- name: train
num_bytes: 167384.662601626
num_examples: 787
- name: test
num_bytes: 41899.33739837398
num_examples: 197
download_size: 142646
dataset_size: 209284.0
---
# Dataset Card for "en_de_TR"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/65_People_15204_Videos_of_Sports_and_Fitness_Video_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
65 People – 15,204 Videos of Sports and Fitness Video Data. The data was collected in indoor scenes. The race distribution is Asian, Black and Caucasian; the age distribution is young and middle-aged people. The collection devices are IR and RGB cameras. The dataset diversity includes different races, different age groups, different shooting angles, different collection distances, different human body orientations, different costumes and various fitness actions. The data can be used for tasks such as human behavior recognition and human segmentation in fitness scenes.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1212?source=Huggingface
## Data size
65 people, 15,204 videos
## Population distribution
race distribution: 37 Asians, 24 Caucasians, 4 Black People; gender distribution: 24 males, 41 females; age distribution: 61 young people, 4 middle-aged people
## Collecting environment
indoor scenes
## Data diversity
different races, different age groups, different shooting angles, different collection distances, different human body orientations, different costumes and various fitness actions
## Device
infrared and color cameras, the camera resolution is 1,920x1,080
## Collecting angle
eye-level angle, simultaneous collection by three cameras (left, middle and right)
## Data format
.mp4
## Collection content
collecting fitness videos of different people under one or two sets of clothing
## Accuracy
the accuracy of video action is not less than 97%; the accuracy of label annotation is not less than 97%
# Licensing Information
Commercial License
|
CyberHarem/minamoto_no_raikou_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of minamoto_no_raikou/源頼光/源赖光 (Fate/Grand Order)
This is the dataset of minamoto_no_raikou/源頼光/源赖光 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `purple_hair, long_hair, breasts, purple_eyes, very_long_hair, parted_bangs, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 813.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minamoto_no_raikou_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 698.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minamoto_no_raikou_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1221 | 1.24 GiB | [Download](https://huggingface.co/datasets/CyberHarem/minamoto_no_raikou_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/minamoto_no_raikou_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, bare_shoulders, blue_sky, choker, collarbone, eyepatch_bikini, looking_at_viewer, purple_bikini, solo, armlet, cleavage, day, navel, beads, thighs, side-tie_bikini_bottom, smile, black_gloves, blush, low-tied_long_hair, outdoors, single_glove, parted_lips |
| 1 | 5 |  |  |  |  |  | black_serafuku, black_shirt, black_skirt, choker, crop_top_overhang, red_neckerchief, 1girl, black_sailor_collar, fingerless_gloves, long_skirt, looking_at_viewer, midriff, purple_bikini, red_gloves, smile, solo, navel, pleated_skirt, short_sleeves, thighs, beads, blue_sky, collarbone, day, low-tied_long_hair, outdoors, rope, side_slit, simple_background, single_glove |
| 2 | 7 |  |  |  |  |  | 1girl, blush, completely_nude, smile, solo, thighs, looking_at_viewer, navel, nipples, collarbone, closed_mouth, huge_breasts, sitting |
| 3 | 9 |  |  |  |  |  | 1boy, 1girl, beach, blush, day, heart, hetero, outdoors, bikini_bottom_only, choker, dark-skinned_male, huge_breasts, interracial, purple_bikini, solo_focus, sweat, topless, collarbone, sex, bikini_bottom_aside, navel, nipples, ocean, open_mouth, side-tie_bikini_bottom, water, pussy, sky, thighs, vaginal, sand, spread_legs |
| 4 | 26 |  |  |  |  |  | 1girl, blush, hetero, 1boy, completely_nude, sex, nipples, open_mouth, penis, sweat, pussy, thighs, vaginal, solo_focus, mosaic_censoring, navel, ass, looking_at_viewer |
| 5 | 6 |  |  |  |  |  | 1girl, black_gloves, fingerless_gloves, holding_sword, katana, low-tied_long_hair, purple_bodysuit, solo, arm_guards, looking_at_viewer, ribbed_sleeves, rope, tabard, kote |
| 6 | 9 |  |  |  |  |  | 1girl, black_gloves, fingerless_gloves, purple_bodysuit, solo, looking_at_viewer, simple_background, smile, ribbed_sleeves, white_background, closed_mouth, rope |
| 7 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, ribbed_sweater, smile, solo, white_sweater, long_sleeves, turtleneck_sweater, blush, bare_shoulders, low-tied_long_hair, simple_background, skirt, white_background |
| 8 | 16 |  |  |  |  |  | hetero, 1boy, 1girl, blush, huge_breasts, paizuri, erection, solo_focus, dark-skinned_male, looking_at_viewer, cum_on_breasts, interracial, uncensored, covered_nipples, ejaculation, heart, nude, open_mouth, purple_bodysuit, breasts_squeezed_together, large_penis, pov_crotch, smile |
| 9 | 10 |  |  |  |  |  | bare_shoulders, looking_at_viewer, playboy_bunny, rabbit_ears, 1girl, cleavage, detached_collar, fake_animal_ears, purple_leotard, solo, thighs, blush, bowtie, rabbit_tail, smile, fishnet_pantyhose, highleg_leotard, wrist_cuffs, gloves, high_heels, low-tied_long_hair, simple_background, strapless_leotard, white_background |
| 10 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, obi, purple_kimono, blush, hair_ornament, wide_sleeves, bare_shoulders, long_sleeves |
| 11 | 6 |  |  |  |  |  | 1girl, enmaided, maid_apron, maid_headdress, solo, looking_at_viewer, white_background, black_dress, frills, simple_background, thighhighs, blush, juliet_sleeves |
| 12 | 8 |  |  |  |  |  | 1girl, bare_shoulders, china_dress, looking_at_viewer, purple_dress, solo, smile, earrings, holding_fan, gloves, sleeveless, thighs, blush, folded_fan, sitting, covered_navel, high_heels, parted_lips |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | blue_sky | choker | collarbone | eyepatch_bikini | looking_at_viewer | purple_bikini | solo | armlet | cleavage | day | navel | beads | thighs | side-tie_bikini_bottom | smile | black_gloves | blush | low-tied_long_hair | outdoors | single_glove | parted_lips | black_serafuku | black_shirt | black_skirt | crop_top_overhang | red_neckerchief | black_sailor_collar | fingerless_gloves | long_skirt | midriff | red_gloves | pleated_skirt | short_sleeves | rope | side_slit | simple_background | completely_nude | nipples | closed_mouth | huge_breasts | sitting | 1boy | beach | heart | hetero | bikini_bottom_only | dark-skinned_male | interracial | solo_focus | sweat | topless | sex | bikini_bottom_aside | ocean | open_mouth | water | pussy | sky | vaginal | sand | spread_legs | penis | mosaic_censoring | ass | holding_sword | katana | purple_bodysuit | arm_guards | ribbed_sleeves | tabard | kote | white_background | ribbed_sweater | white_sweater | long_sleeves | turtleneck_sweater | skirt | paizuri | erection | cum_on_breasts | uncensored | covered_nipples | ejaculation | nude | breasts_squeezed_together | large_penis | pov_crotch | playboy_bunny | rabbit_ears | detached_collar | fake_animal_ears | purple_leotard | bowtie | rabbit_tail | fishnet_pantyhose | highleg_leotard | wrist_cuffs | gloves | high_heels | strapless_leotard | obi | purple_kimono | hair_ornament | wide_sleeves | enmaided | maid_apron | maid_headdress | black_dress | frills | thighhighs | juliet_sleeves | china_dress | purple_dress | earrings | holding_fan | sleeveless | folded_fan | covered_navel |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-----------------|:-----------|:---------|:-------------|:------------------|:--------------------|:----------------|:-------|:---------|:-----------|:------|:--------|:--------|:---------|:-------------------------|:--------|:---------------|:--------|:---------------------|:-----------|:---------------|:--------------|:-----------------|:--------------|:--------------|:--------------------|:------------------|:----------------------|:--------------------|:-------------|:----------|:-------------|:----------------|:----------------|:-------|:------------|:--------------------|:------------------|:----------|:---------------|:---------------|:----------|:-------|:--------|:--------|:---------|:---------------------|:--------------------|:--------------|:-------------|:--------|:----------|:------|:----------------------|:--------|:-------------|:--------|:--------|:------|:----------|:-------|:--------------|:--------|:-------------------|:------|:----------------|:---------|:------------------|:-------------|:-----------------|:---------|:-------|:-------------------|:-----------------|:----------------|:---------------|:---------------------|:--------|:----------|:-----------|:-----------------|:-------------|:------------------|:--------------|:-------|:----------------------------|:--------------|:-------------|:----------------|:--------------|:------------------|:-------------------|:-----------------|:---------|:--------------|:--------------------|:------------------|:--------------|:---------|:-------------|:--------------------|:------|:----------------|:----------------|:---------------|:-----------|:-------------|:-----------------|:--------------|:---------|:-------------|:-----------------|:--------------|:---------------|:-----------|:--------------|:-------------|:-------------|:----------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | X | X | X | | X | X | X | | | X | X | X | X | | X | | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | | | X | | X | | X | | | | X | | X | | X | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | | | X | X | | | X | | | | X | X | | X | X | | | X | | X | | | | | | | | | | | | | | | | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 26 |  |  |  |  |  | X | | | | | | X | | | | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | X | X | | | | X | | | X | | | | X | X | | X | | | X | | X | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | | | | | X | | X | | | | | | | | | X | | X | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | | | | | | X | | X | | | | | | | | X | X | | | | | | | | | | | | X | | | | | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | | | | | X | | X | | | | | | | | X | | X | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 16 |  |  |  |  |  | X | | | | | | X | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | X | | X | | X | X | | X | X | X | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 10 |  |  |  |  |  | X | X | | | | | X | | X | | X | | | | X | | X | | X | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 10 | 11 |  |  |  |  |  | X | X | | | | | X | | X | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | |
| 11 | 6 |  |  |  |  |  | X | | | | | | X | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | |
| 12 | 8 |  |  |  |  |  | X | X | | | | | X | | X | | | | | | X | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | X | X | X | X | X | X | X |
|
ppxscal/academic_embeddings_cosimrank_bfs_small | ---
dataset_info:
features:
- name: Query Text
dtype: string
- name: Ranking 1
dtype: string
- name: Ranking 2
dtype: string
- name: Ranking 3
dtype: string
- name: Ranking 4
dtype: string
- name: Ranking 5
dtype: string
- name: Ranking 6
dtype: string
- name: Ranking 7
dtype: string
- name: Ranking 8
dtype: string
- name: Ranking 9
dtype: string
- name: Ranking 10
dtype: string
- name: Ranking 11
dtype: string
- name: Ranking 12
dtype: string
- name: Ranking 13
dtype: string
- name: score_0
dtype: float64
- name: score_1
dtype: float64
- name: score_2
dtype: float64
- name: score_3
dtype: float64
- name: score_4
dtype: float64
- name: score_5
dtype: float64
- name: score_6
dtype: float64
- name: score_7
dtype: float64
- name: score_8
dtype: float64
- name: score_9
dtype: float64
- name: score_10
dtype: float64
- name: score_11
dtype: float64
- name: score_12
dtype: float64
- name: score_13
dtype: float64
splits:
- name: train
num_bytes: 372312176
num_examples: 25895
download_size: 118175466
dataset_size: 372312176
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dhmeltzer/asks_validation_embedded | ---
dataset_info:
features:
- name: q_id
dtype: string
- name: title
dtype: string
- name: selftext
dtype: string
- name: document
dtype: string
- name: subreddit
dtype: string
- name: answers
sequence:
- name: a_id
dtype: string
- name: text
dtype: string
- name: score
dtype: int32
- name: title_urls
sequence:
- name: url
dtype: string
- name: selftext_urls
sequence:
- name: url
dtype: string
- name: answers_urls
sequence:
- name: url
dtype: string
- name: title_body
dtype: string
- name: embeddings
sequence: float32
splits:
- name: validation_asks
num_bytes: 17840672
num_examples: 2281
download_size: 15368159
dataset_size: 17840672
---
# Dataset Card for "asks_validation_embedded"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tuatruog/astro-image-dataset | ---
license: mit
---
|
benlehrburger/dreambooth-animal | ---
tags:
- pytorch
- stable diffusion
- dreambooth
- diffusion-models-class
---
# Dreambooth training set using Nala (a bunny)
Precursor to further Dreambooth experimentation
## Usage
```python
from diffusers import DDPMPipeline
pipeline = DDPMPipeline.from_pretrained('benlehrburger/dreambooth-animal')
image = pipeline().images[0]
image
``` |
raptorkwok/cantonese-traditional-chinese-parallel-corpus | ---
license: cc0-1.0
task_categories:
- translation
language:
- zh
pretty_name: Cantonese-Written Chinese Parallel Corpus
size_categories:
- 100K<n<1M
---
This dataset is a Cantonese-Written Chinese parallel corpus containing 130k+ pairs of Cantonese and Traditional Chinese sentences. |
billy-bater/folio-cwa | ---
license: apache-2.0
---
|
songwenx/embeddings-test-0801 | ---
license: apache-2.0
---
|
LWades/Surface_code_and_Toric_code | ---
tags:
- Quantum
- QEC
--- |
Jojolands/Fernanda | ---
license: openrail
---
|
JanosAudran/financial-reports-sec | ---
annotations_creators:
- expert-generated
language:
- en
language_creators:
- expert-generated
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: US public firm Annual Reports (10-K)
size_categories:
- 10M<n<100M
source_datasets:
- extended|other
tags:
- finance
- financial
- 10-K
- 10K
- 10k
- 10-k
- annual
- reports
- sec
- edgar
- sentiment
- firm
- public
- us
task_categories:
- fill-mask
- text-classification
task_ids:
- masked-language-modeling
- multi-class-classification
- sentiment-classification
dataset_info:
- config_name: large_lite
features:
- name: cik
dtype: string
- name: sentence
dtype: string
- name: section
dtype:
class_label:
names:
"0": section_1
"1": section_10
"2": section_11
"3": section_12
"4": section_13
"5": section_14
"6": section_15
"7": section_1A
"8": section_1B
"9": section_2
"10": section_3
"11": section_4
"12": section_5
"13": section_6
"14": section_7
"15": section_7A
"16": section_8
"17": section_9
"18": section_9A
"19": section_9B
- name: labels
struct:
- name: 1d
dtype:
class_label:
names:
"0": positive
"1": negative
- name: 5d
dtype:
class_label:
names:
"0": positive
"1": negative
- name: 30d
dtype:
class_label:
names:
"0": positive
"1": negative
- name: filingDate
dtype: string
- name: docID
dtype: string
- name: sentenceID
dtype: string
- name: sentenceCount
dtype: int64
splits:
- name: train
num_bytes: 16424576472
num_examples: 67316227
- name: validation
num_bytes: 423527281
num_examples: 1585561
- name: test
num_bytes: 773116540
num_examples: 2965174
download_size: 13362319126
dataset_size: 17621220293
- config_name: large_full
features:
- name: cik
dtype: string
- name: sentence
dtype: string
- name: section
dtype:
class_label:
names:
"0": section_1
"1": section_10
"2": section_11
"3": section_12
"4": section_13
"5": section_14
"6": section_15
"7": section_1A
"8": section_1B
"9": section_2
"10": section_3
"11": section_4
"12": section_5
"13": section_6
"14": section_7
"15": section_7A
"16": section_8
"17": section_9
"18": section_9A
"19": section_9B
- name: labels
struct:
- name: 1d
dtype:
class_label:
names:
"0": positive
"1": negative
- name: 5d
dtype:
class_label:
names:
"0": positive
"1": negative
- name: 30d
dtype:
class_label:
names:
"0": positive
"1": negative
- name: filingDate
dtype: string
- name: name
dtype: string
- name: docID
dtype: string
- name: sentenceID
dtype: string
- name: sentenceCount
dtype: int64
- name: tickers
list: string
- name: exchanges
list: string
- name: entityType
dtype: string
- name: sic
dtype: string
- name: stateOfIncorporation
dtype: string
- name: tickerCount
dtype: int32
- name: acceptanceDateTime
dtype: string
- name: form
dtype: string
- name: reportDate
dtype: string
- name: returns
struct:
- name: 1d
struct:
- name: closePriceEndDate
dtype: float32
- name: closePriceStartDate
dtype: float32
- name: endDate
dtype: string
- name: startDate
dtype: string
- name: ret
dtype: float32
- name: 5d
struct:
- name: closePriceEndDate
dtype: float32
- name: closePriceStartDate
dtype: float32
- name: endDate
dtype: string
- name: startDate
dtype: string
- name: ret
dtype: float32
- name: 30d
struct:
- name: closePriceEndDate
dtype: float32
- name: closePriceStartDate
dtype: float32
- name: endDate
dtype: string
- name: startDate
dtype: string
- name: ret
dtype: float32
splits:
- name: train
num_bytes: 39306095718
num_examples: 67316227
- name: validation
num_bytes: 964030458
num_examples: 1585561
- name: test
num_bytes: 1785383996
num_examples: 2965174
download_size: 13362319126
dataset_size: 42055510172
- config_name: small_full
features:
- name: cik
dtype: string
- name: sentence
dtype: string
- name: section
dtype:
class_label:
names:
"0": section_1
"1": section_1A
"2": section_1B
"3": section_2
"4": section_3
"5": section_4
"6": section_5
"7": section_6
"8": section_7
"9": section_7A
"10": section_8
"11": section_9
"12": section_9A
"13": section_9B
"14": section_10
"15": section_11
"16": section_12
"17": section_13
"18": section_14
"19": section_15
- name: labels
struct:
- name: 1d
dtype:
class_label:
names:
"0": positive
"1": negative
- name: 5d
dtype:
class_label:
names:
"0": positive
"1": negative
- name: 30d
dtype:
class_label:
names:
"0": positive
"1": negative
- name: filingDate
dtype: string
- name: name
dtype: string
- name: docID
dtype: string
- name: sentenceID
dtype: string
- name: sentenceCount
dtype: int64
- name: tickers
list: string
- name: exchanges
list: string
- name: entityType
dtype: string
- name: sic
dtype: string
- name: stateOfIncorporation
dtype: string
- name: tickerCount
dtype: int32
- name: acceptanceDateTime
dtype: string
- name: form
dtype: string
- name: reportDate
dtype: string
- name: returns
struct:
- name: 1d
struct:
- name: closePriceEndDate
dtype: float32
- name: closePriceStartDate
dtype: float32
- name: endDate
dtype: string
- name: startDate
dtype: string
- name: ret
dtype: float32
- name: 5d
struct:
- name: closePriceEndDate
dtype: float32
- name: closePriceStartDate
dtype: float32
- name: endDate
dtype: string
- name: startDate
dtype: string
- name: ret
dtype: float32
- name: 30d
struct:
- name: closePriceEndDate
dtype: float32
- name: closePriceStartDate
dtype: float32
- name: endDate
dtype: string
- name: startDate
dtype: string
- name: ret
dtype: float32
splits:
- name: train
num_bytes: 128731540
num_examples: 200000
- name: validation
num_bytes: 13411689
num_examples: 20000
- name: test
num_bytes: 13188331
num_examples: 20000
download_size: 42764380
dataset_size: 155331560
- config_name: small_lite
features:
- name: cik
dtype: string
- name: sentence
dtype: string
- name: section
dtype:
class_label:
names:
"0": section_1
"1": section_1A
"2": section_1B
"3": section_2
"4": section_3
"5": section_4
"6": section_5
"7": section_6
"8": section_7
"9": section_7A
"10": section_8
"11": section_9
"12": section_9A
"13": section_9B
"14": section_10
"15": section_11
"16": section_12
"17": section_13
"18": section_14
"19": section_15
- name: labels
struct:
- name: 1d
dtype:
class_label:
names:
"0": positive
"1": negative
- name: 5d
dtype:
class_label:
names:
"0": positive
"1": negative
- name: 30d
dtype:
class_label:
names:
"0": positive
"1": negative
- name: filingDate
dtype: string
- name: docID
dtype: string
- name: sentenceID
dtype: string
- name: sentenceCount
dtype: int64
splits:
- name: train
num_bytes: 60681688
num_examples: 200000
- name: validation
num_bytes: 6677389
num_examples: 20000
- name: test
num_bytes: 6351730
num_examples: 20000
download_size: 42764380
dataset_size: 73710807
---
# Dataset Card for financial-reports-sec
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Configurations](#dataset-configurations)
- [Usage](#usage)
- [Supported Tasks](#supported-tasks)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Summary Statistics](#dataset-summary-statistics)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [References](#references)
- [Citation Information](#citation-information)
## Dataset Description
- **Point of Contact:** Aman Khan
### Dataset Summary
The dataset contains the annual reports of US public firms filed with the SEC EDGAR system from 1993 to 2020. Each annual report (**10-K filing**) is broken into 20 sections, and each section is split into individual sentences. Sentiment labels are provided on a **per-filing basis**, derived from the market reaction around the filing date for 3 different time windows: _[t-1, t+1]_, _[t-1, t+5]_ and _[t-1, t+30]_. Additional metadata for each filing is included in the dataset.
### Dataset Configurations
**Four** configurations are available:
- _**large_lite**_:
- Contains only the basic features needed. Extra metadata is omitted.
- Features List:
- **cik**
- **sentence**
- **section**
- **labels**
- **filingDate**
- **docID**
- **sentenceID**
- **sentenceCount**
- _**large_full**_:
- All features are included.
- Features List (excluding those already in the lite version above):
- **name**
- **tickers**
- **exchanges**
- **entityType**
- **sic**
- **stateOfIncorporation**
- **tickerCount**
- **acceptanceDateTime**
- **form**
- **reportDate**
- **returns**
- _**small_lite**_:
- Same as _**large_lite**_ version except that only (200,000/20,000/20,000) sentences are loaded for (train/test/validation) splits.
- _**small_full**_:
- Same as _**large_full**_ version except that only (200,000/20,000/20,000) sentences are loaded for (train/test/validation) splits.
### Usage
```python
import datasets
# Load the lite configuration of the dataset
raw_dataset = datasets.load_dataset("JanosAudran/financial-reports-sec", "large_lite")
# Load a specific split
raw_dataset = datasets.load_dataset("JanosAudran/financial-reports-sec", "small_full", split="train")
```
### Supported Tasks
The tasks the dataset can be used for directly include:
- _Masked Language Modelling_
- A model like BERT can be fine-tuned on this corpus of financial text.
- _Sentiment Analysis_
- For each annual report a label ["positive", "negative"] is provided based on the market reaction around the filing date (refer to [Annotations](#annotations)).
- _Next Sentence Prediction/Sentence Order Prediction_
- Sentences extracted from the filings are in their original order and as such the dataset can be adapted very easily for either of these tasks.
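Because sentences within a filing keep their original order, adjacent-sentence pairs can be assembled directly. The following is a minimal sketch using toy rows in place of real ones; the field names `docID`, `sentenceCount`, and `sentence` come from the dataset's features, but the pairing logic itself is an illustration, not an official recipe:

```python
# Toy rows standing in for rows loaded via
# datasets.load_dataset("JanosAudran/financial-reports-sec", "small_lite").
rows = [
    {"docID": "0000001750_10-K_2020", "sentenceCount": 1, "sentence": "First sentence."},
    {"docID": "0000001750_10-K_2020", "sentenceCount": 3, "sentence": "Third sentence."},
    {"docID": "0000001750_10-K_2020", "sentenceCount": 2, "sentence": "Second sentence."},
]

# Restore the original sentence order within each filing.
rows.sort(key=lambda r: (r["docID"], r["sentenceCount"]))

# Pair each sentence with its successor, staying within one document.
pairs = [
    (a["sentence"], b["sentence"])
    for a, b in zip(rows, rows[1:])
    if a["docID"] == b["docID"]
]
print(pairs[0])  # ('First sentence.', 'Second sentence.')
```

Negative (non-adjacent) pairs for NSP training would then be drawn by sampling sentences from other documents.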
### Languages
All sentences are in English.
## Dataset Structure
### Data Instances
Refer to dataset preview.
### Data Fields
**Feature Name**
- Description
- Data type
- Example/Structure
**cik**
- 10 digit identifier used by SEC for a firm.
- _string_
- '0000001750'
**sentence**
- A single sentence from the 10-K filing.
- _string_
- 'The finance agreement is secured by a first priority security interest in all insurance policies, all unearned premium, return premiums, dividend payments and loss payments thereof.'
**section**
- The section of the 10-K filing in which the sentence is located.
- _ClassLabel_
- ```python
ClassLabel(names=['section_1', 'section_10', 'section_11', 'section_12', 'section_13', 'section_14', 'section_15', 'section_1A', 'section_1B', 'section_2','section_3', 'section_4', 'section_5', 'section_6', 'section_7', 'section_7A','section_8', 'section_9', 'section_9A', 'section_9B'], id=None)
```
**labels**
- The sentiment label for the entire filing (_**positive**_ or _**negative**_) based on different time windows.
- _Dict of ClassLabels_
- ```python
{
'1d': ClassLabel(names=['positive', 'negative'], id=None),
'5d': ClassLabel(names=['positive', 'negative'], id=None),
'30d': ClassLabel(names=['positive', 'negative'], id=None)
}
```
**filingDate**
- The date the 10-K report was filed with the SEC.
- _string_
- '2021-03-10'
**docID**
- Unique ID for identifying the exact 10-K filing. Unique across all configs and splits. Can be used to identify the document from which the sentence came.
- _string_
- '0000001750_10-K_2020'
**sentenceID**
- Unique ID for identifying the exact sentence. Unique across all configs and splits.
- _string_
- '0000001750_10-K_2020_section_1_100'
**sentenceCount**
- Integer identifying the running sequence for the sentence. Unique **only** within a given config and split.
- _int_
- 123
**name**
- The name of the filing entity
- _string_
- 'Investar Holding Corp'
**tickers**
- List of ticker symbols for the filing entity.
- _List of strings_
- ['ISTR']
**exchanges**
- List of exchanges for the filing entity.
- _List of strings_
- ['Nasdaq']
**entityType**
- The type of entity as identified in the 10-K filing.
- _string_
- 'operating'
**sic**
- Four digit SIC code for the filing entity.
- _string_
- '6022'
**stateOfIncorporation**
- Two character code for the state of incorporation for the filing entity.
- _string_
- 'LA'
**tickerCount**
- _**Internal use**_. Count of ticker symbols. Always 1.
- _int_
- 1
**acceptanceDateTime**
- The full timestamp of when the filing was accepted into the SEC EDGAR system.
- _string_
- '2021-03-10T14:26:11.000Z'
**form**
- The type of filing. Always 10-K in the dataset.
- _string_
- '10-K'
**reportDate**
- The last date in the fiscal year for which the entity is filing the report.
- _string_
- '2020-12-31'
**returns**
- _**Internal use**_. The prices and timestamps used to calculate the sentiment labels.
- _Dict_
- ```python
{'1d': {
'closePriceEndDate': 21.45746421813965,
'closePriceStartDate': 20.64960479736328,
'endDate': '2021-03-11T00:00:00-05:00',
'startDate': '2021-03-09T00:00:00-05:00',
'ret': 0.03912226855754852
},
'5d': {
'closePriceEndDate': 21.743167877197266,
'closePriceStartDate': 20.64960479736328,
'endDate': '2021-03-15T00:00:00-04:00',
'startDate': '2021-03-09T00:00:00-05:00',
'ret': 0.052958063781261444
},
'30d': {
'closePriceEndDate': 20.63919448852539,
'closePriceStartDate': 20.64960479736328,
'endDate': '2021-04-09T00:00:00-04:00',
'startDate': '2021-03-09T00:00:00-05:00',
'ret': -0.0005041408003307879}}
```
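As a hedged sketch of how the label relates to the `returns` field, the per-window label plausibly follows the sign of the close-to-close return; the close prices below are the 1d values from the example above, but the `>= 0` threshold is an assumption, and the authors' exact labeling rule may differ:

```python
# 1d close prices from the example 'returns' struct above.
close_start = 20.64960479736328
close_end = 21.45746421813965

# Recompute the window return; this reproduces the 'ret' field shown above.
ret = (close_end - close_start) / close_start

# Assumed mapping from return sign to sentiment label (threshold is a guess).
label = "positive" if ret >= 0 else "negative"

print(round(ret, 6), label)  # 0.039122 positive
```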
### Data Splits
| Config | train | validation | test |
| ---------- | ---------: | ---------: | --------: |
| large_full | 67,316,227 | 1,585,561 | 2,965,174 |
| large_lite | 67,316,227 | 1,585,561 | 2,965,174 |
| small_full | 200,000 | 20,000 | 20,000 |
| small_lite | 200,000 | 20,000 | 20,000 |
### Dataset Summary Statistics
| Variable | count | mean | std | min | 1% | 25% | 50% | 75% | 99% | max |
| :-------------------------------- | ---------: | ----: | -----: | -----: | -----: | -----: | ----: | ----: | ----: | --------: |
| Unique Firm Count | 4,677 | | | | | | | | | |
| Filings Count | 55,349 | | | | | | | | | |
| Sentence Count | 71,866,962 | | | | | | | | | |
| Filings per Firm | 4,677 | 12 | 9 | 1 | 1 | 4 | 11 | 19 | 27 | 28 |
| Return per Filing - 1d | 55,349 | 0.008 | 0.394 | -0.973 | -0.253 | -0.023 | 0 | 0.02 | 0.367 | 77.977 |
| Return per Filing - 5d | 55,349 | 0.013 | 0.584 | -0.99 | -0.333 | -0.034 | 0 | 0.031 | 0.5 | 100 |
| Return per Filing - 30d | 55,349 | 0.191 | 22.924 | -0.999 | -0.548 | -0.068 | 0.001 | 0.074 | 1 | 5,002.748 |
| Sentences per Filing | 55,349 | 1,299 | 654 | 0 | 110 | 839 | 1,268 | 1,681 | 3,135 | 8,286 |
| Sentences by Section - section_1 | 55,349 | 221 | 183 | 0 | 0 | 97 | 180 | 293 | 852 | 2,724 |
| Sentences by Section - section_10 | 55,349 | 24 | 40 | 0 | 0 | 4 | 6 | 20 | 173 | 1,594 |
| Sentences by Section - section_11 | 55,349 | 16 | 47 | 0 | 0 | 3 | 3 | 4 | 243 | 808 |
| Sentences by Section - section_12 | 55,349 | 9 | 14 | 0 | 0 | 3 | 4 | 8 | 56 | 1,287 |
| Sentences by Section - section_13 | 55,349 | 8 | 20 | 0 | 0 | 3 | 3 | 4 | 79 | 837 |
| Sentences by Section - section_14 | 55,349 | 22 | 93 | 0 | 0 | 3 | 3 | 8 | 413 | 3,536 |
| Sentences by Section - section_15 | 55,349 | 177 | 267 | 0 | 0 | 9 | 26 | 315 | 1,104 | 4,140 |
| Sentences by Section - section_1A | 55,349 | 197 | 204 | 0 | 0 | 3 | 158 | 292 | 885 | 2,106 |
| Sentences by Section - section_1B | 55,349 | 4 | 31 | 0 | 0 | 1 | 3 | 3 | 13 | 2,414 |
| Sentences by Section - section_2 | 55,349 | 16 | 45 | 0 | 0 | 6 | 8 | 13 | 169 | 1,903 |
| Sentences by Section - section_3 | 55,349 | 14 | 36 | 0 | 0 | 4 | 5 | 12 | 121 | 2,326 |
| Sentences by Section - section_4 | 55,349 | 7 | 17 | 0 | 0 | 3 | 3 | 4 | 66 | 991 |
| Sentences by Section - section_5 | 55,349 | 20 | 41 | 0 | 0 | 10 | 15 | 21 | 87 | 3,816 |
| Sentences by Section - section_6 | 55,349 | 8 | 29 | 0 | 0 | 3 | 4 | 7 | 43 | 2,156 |
| Sentences by Section - section_7 | 55,349 | 265 | 198 | 0 | 0 | 121 | 246 | 373 | 856 | 4,539 |
| Sentences by Section - section_7A | 55,349 | 18 | 52 | 0 | 0 | 3 | 9 | 21 | 102 | 3,596 |
| Sentences by Section - section_8 | 55,349 | 257 | 296 | 0 | 0 | 3 | 182 | 454 | 1,105 | 4,431 |
| Sentences by Section - section_9 | 55,349 | 5 | 33 | 0 | 0 | 3 | 3 | 4 | 18 | 2,330 |
| Sentences by Section - section_9A | 55,349 | 17 | 16 | 0 | 0 | 8 | 15 | 23 | 50 | 794 |
| Sentences by Section - section_9B | 55,349 | 4 | 18 | 0 | 0 | 2 | 3 | 4 | 23 | 813 |
| Word count per Sentence | 71,866,962 | 28 | 22 | 1 | 2 | 16 | 24 | 34 | 98 | 8,675 |
## Dataset Creation
### Curation Rationale
To create this dataset, multiple sources of information were cleaned, processed, and merged. Starting from the raw filings:
- Useful metadata about the filing and firm was added.
- Time windows around the filing date were carefully created.
- Stock price data was then added for the windows.
- Ambiguous/duplicate records were removed.
### Source Data
#### Initial Data Collection and Normalization
Initial data was collected and processed by the authors of the research paper [**EDGAR-CORPUS: Billions of Tokens Make The World Go Round**](#references). Market price and returns data was collected from Yahoo Finance. Additional metadata was collected from SEC.
#### Who are the source language producers?
US public firms filing with the SEC.
### Annotations
#### Annotation process
Labels for sentiment classification are based on buy-and-hold returns over a fixed time window around the filing date with the SEC, i.e. when the information becomes public. Returns are used because they reflect the market's combined intelligence at parsing the new information in the filings. For each filing date **t**, the stock prices at **t-1** and **t+W** are used to calculate the return. If the return is positive, a label of **positive** is assigned; otherwise, a label of **negative** is assigned. Three different windows are used to assign the labels:
- **1d**: _[t-1, t+1]_
- **5d**: _[t-1, t+5]_
- **30d**: _[t-1, t+30]_
The windows are based on calendar days and are adjusted for weekends and holidays. The rationale behind using three windows is as follows:
- A very short window may not give enough time for all the information contained in the filing to be reflected in the stock price.
- A very long window may capture other events that drive stock price for the firm.
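The labeling rule described above can be sketched as follows (a hypothetical helper, not the curators' actual code; the threshold at zero follows the positive/else rule described above):

```python
def return_label(close_t_minus_1: float, close_t_plus_w: float) -> str:
    """Assign a sentiment label from the buy-and-hold return over [t-1, t+W].

    Positive return -> 'positive'; otherwise -> 'negative'
    (a zero return falls into 'negative', per the 'else' branch above).
    """
    ret = close_t_plus_w / close_t_minus_1 - 1
    return "positive" if ret > 0 else "negative"
```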
#### Who are the annotators?
Financial market participants.
### Personal and Sensitive Information
The dataset contains public filings data from the SEC. Market returns data was collected from Yahoo Finance.
## Considerations for Using the Data
### Social Impact of Dataset
Low to none.
### Discussion of Biases
The dataset covers financial information of public companies, so the tone and style of the text are in line with financial literature.
### Other Known Limitations
NA
## Additional Information
### Dataset Curators
**Aman Khan**
### Licensing Information
This dataset is provided under the Apache 2.0 license.
### References
- Lefteris Loukas, Manos Fergadiotis, Ion Androutsopoulos, & Prodromos Malakasiotis. (2021). EDGAR-CORPUS [Data set]. Zenodo. https://doi.org/10.5281/zenodo.5589195
### Citation Information
Please use the following to cite this dataset:
```
@ONLINE{financial-reports-sec,
author = "Aman Khan",
title = "Financial Reports SEC",
url = "https://huggingface.co/datasets/JanosAudran/financial-reports-sec"
}
```
|
open-llm-leaderboard/details_unaidedelf87777__wizard-mistral-v0.1 | ---
pretty_name: Evaluation run of unaidedelf87777/wizard-mistral-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [unaidedelf87777/wizard-mistral-v0.1](https://huggingface.co/unaidedelf87777/wizard-mistral-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_unaidedelf87777__wizard-mistral-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T00:26:05.989697](https://huggingface.co/datasets/open-llm-leaderboard/details_unaidedelf87777__wizard-mistral-v0.1/blob/main/results_2023-10-24T00-26-05.989697.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.005662751677852349,\n\
\ \"em_stderr\": 0.0007684582267637443,\n \"f1\": 0.07014261744966427,\n\
\ \"f1_stderr\": 0.0015546181894855703,\n \"acc\": 0.4866237666597055,\n\
\ \"acc_stderr\": 0.011199109496696186\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.005662751677852349,\n \"em_stderr\": 0.0007684582267637443,\n\
\ \"f1\": 0.07014261744966427,\n \"f1_stderr\": 0.0015546181894855703\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19029567854435178,\n \
\ \"acc_stderr\": 0.010812347283182963\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.01158587171020941\n\
\ }\n}\n```"
repo_url: https://huggingface.co/unaidedelf87777/wizard-mistral-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|arc:challenge|25_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T00_26_05.989697
path:
- '**/details_harness|drop|3_2023-10-24T00-26-05.989697.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T00-26-05.989697.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T00_26_05.989697
path:
- '**/details_harness|gsm8k|5_2023-10-24T00-26-05.989697.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T00-26-05.989697.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hellaswag|10_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T00_26_05.989697
path:
- '**/details_harness|winogrande|5_2023-10-24T00-26-05.989697.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T00-26-05.989697.parquet'
- config_name: results
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- results_2023-10-11T22-55-59.459837.parquet
- split: 2023_10_24T00_26_05.989697
path:
- results_2023-10-24T00-26-05.989697.parquet
- split: latest
path:
- results_2023-10-24T00-26-05.989697.parquet
---
# Dataset Card for Evaluation run of unaidedelf87777/wizard-mistral-v0.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/unaidedelf87777/wizard-mistral-v0.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [unaidedelf87777/wizard-mistral-v0.1](https://huggingface.co/unaidedelf87777/wizard-mistral-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_unaidedelf87777__wizard-mistral-v0.1",
"harness_winogrande_5",
split="train")
```
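
Each non-"latest" split is named after the run timestamp (e.g. `2023_10_24T00_26_05.989697`). A minimal sketch of turning such a split name back into a `datetime`, assuming the `%Y_%m_%dT%H_%M_%S.%f` pattern inferred from the split names in this card's configs:

```python
from datetime import datetime

def split_name_to_datetime(split_name: str) -> datetime:
    """Parse a run-timestamp split name like '2023_10_24T00_26_05.989697'."""
    # Format string is inferred from the split names listed in the configs above.
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

run_time = split_name_to_datetime("2023_10_24T00_26_05.989697")
print(run_time.isoformat())  # 2023-10-24T00:26:05.989697
```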
## Latest results
These are the [latest results from run 2023-10-24T00:26:05.989697](https://huggingface.co/datasets/open-llm-leaderboard/details_unaidedelf87777__wizard-mistral-v0.1/blob/main/results_2023-10-24T00-26-05.989697.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.005662751677852349,
"em_stderr": 0.0007684582267637443,
"f1": 0.07014261744966427,
"f1_stderr": 0.0015546181894855703,
"acc": 0.4866237666597055,
"acc_stderr": 0.011199109496696186
},
"harness|drop|3": {
"em": 0.005662751677852349,
"em_stderr": 0.0007684582267637443,
"f1": 0.07014261744966427,
"f1_stderr": 0.0015546181894855703
},
"harness|gsm8k|5": {
"acc": 0.19029567854435178,
"acc_stderr": 0.010812347283182963
},
"harness|winogrande|5": {
"acc": 0.7829518547750592,
"acc_stderr": 0.01158587171020941
}
}
```
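
Once loaded (for instance via the "results" config with `split="latest"`), the aggregated metrics can be picked apart per task. A small offline sketch using a dict that mirrors the JSON structure and values shown above:

```python
# Mirrors the per-task structure of the results JSON shown above.
results = {
    "harness|drop|3": {"em": 0.005662751677852349, "f1": 0.07014261744966427},
    "harness|gsm8k|5": {"acc": 0.19029567854435178},
    "harness|winogrande|5": {"acc": 0.7829518547750592},
}

# Collect the accuracy metric for every task that reports one.
accuracies = {task: m["acc"] for task, m in results.items() if "acc" in m}
best_task = max(accuracies, key=accuracies.get)
print(best_task)  # harness|winogrande|5
```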
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of CorticalStack/mistral-7b-openhermes-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CorticalStack/mistral-7b-openhermes-sft](https://huggingface.co/CorticalStack/mistral-7b-openhermes-sft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CorticalStack__mistral-7b-openhermes-sft\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-16T14:13:04.061725](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-openhermes-sft/blob/main/results_2024-02-16T14-13-04.061725.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6070598204374664,\n\
\ \"acc_stderr\": 0.03297690039129263,\n \"acc_norm\": 0.6130046390646828,\n\
\ \"acc_norm_stderr\": 0.033660154914381686,\n \"mc1\": 0.31456548347613217,\n\
\ \"mc1_stderr\": 0.016255241993179185,\n \"mc2\": 0.4630793817398098,\n\
\ \"mc2_stderr\": 0.014741207245405565\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5776450511945392,\n \"acc_stderr\": 0.01443413871337998,\n\
\ \"acc_norm\": 0.60580204778157,\n \"acc_norm_stderr\": 0.01428052266746732\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6233817964548894,\n\
\ \"acc_stderr\": 0.004835475957610925,\n \"acc_norm\": 0.8200557657837084,\n\
\ \"acc_norm_stderr\": 0.003833559228158668\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.039105257528497236,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.039105257528497236\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067884,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067884\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7096774193548387,\n\
\ \"acc_stderr\": 0.02582210611941589,\n \"acc_norm\": 0.7096774193548387,\n\
\ \"acc_norm_stderr\": 0.02582210611941589\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959215,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959215\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153303,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153303\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110943,\n\
\ \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110943\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.592436974789916,\n \"acc_stderr\": 0.03191863374478466,\n \
\ \"acc_norm\": 0.592436974789916,\n \"acc_norm_stderr\": 0.03191863374478466\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7779816513761468,\n \"acc_stderr\": 0.017818849564796634,\n \"\
acc_norm\": 0.7779816513761468,\n \"acc_norm_stderr\": 0.017818849564796634\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n\
\ \"acc_stderr\": 0.029771775228145624,\n \"acc_norm\": 0.7647058823529411,\n\
\ \"acc_norm_stderr\": 0.029771775228145624\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n\
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.04453197507374984,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.04453197507374984\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459754,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459754\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8020434227330779,\n\
\ \"acc_stderr\": 0.01424887354921758,\n \"acc_norm\": 0.8020434227330779,\n\
\ \"acc_norm_stderr\": 0.01424887354921758\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879716,\n\
\ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879716\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27039106145251396,\n\
\ \"acc_stderr\": 0.014854993938010073,\n \"acc_norm\": 0.27039106145251396,\n\
\ \"acc_norm_stderr\": 0.014854993938010073\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.026643278474508755,\n\
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.026643278474508755\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900922,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900922\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291463,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291463\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4380704041720991,\n\
\ \"acc_stderr\": 0.012671902782567654,\n \"acc_norm\": 0.4380704041720991,\n\
\ \"acc_norm_stderr\": 0.012671902782567654\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681393,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681393\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6176470588235294,\n \"acc_stderr\": 0.01965992249362335,\n \
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.01965992249362335\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31456548347613217,\n\
\ \"mc1_stderr\": 0.016255241993179185,\n \"mc2\": 0.4630793817398098,\n\
\ \"mc2_stderr\": 0.014741207245405565\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774104\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.312357846853677,\n \
\ \"acc_stderr\": 0.012765850404191427\n }\n}\n```"
repo_url: https://huggingface.co/CorticalStack/mistral-7b-openhermes-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|arc:challenge|25_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|gsm8k|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hellaswag|10_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T14-13-04.061725.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T14-13-04.061725.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- '**/details_harness|winogrande|5_2024-02-16T14-13-04.061725.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-16T14-13-04.061725.parquet'
- config_name: results
data_files:
- split: 2024_02_16T14_13_04.061725
path:
- results_2024-02-16T14-13-04.061725.parquet
- split: latest
path:
- results_2024-02-16T14-13-04.061725.parquet
---
# Dataset Card for Evaluation run of CorticalStack/mistral-7b-openhermes-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CorticalStack/mistral-7b-openhermes-sft](https://huggingface.co/CorticalStack/mistral-7b-openhermes-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CorticalStack__mistral-7b-openhermes-sft",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-16T14:13:04.061725](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-openhermes-sft/blob/main/results_2024-02-16T14-13-04.061725.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in its results file and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6070598204374664,
"acc_stderr": 0.03297690039129263,
"acc_norm": 0.6130046390646828,
"acc_norm_stderr": 0.033660154914381686,
"mc1": 0.31456548347613217,
"mc1_stderr": 0.016255241993179185,
"mc2": 0.4630793817398098,
"mc2_stderr": 0.014741207245405565
},
"harness|arc:challenge|25": {
"acc": 0.5776450511945392,
"acc_stderr": 0.01443413871337998,
"acc_norm": 0.60580204778157,
"acc_norm_stderr": 0.01428052266746732
},
"harness|hellaswag|10": {
"acc": 0.6233817964548894,
"acc_stderr": 0.004835475957610925,
"acc_norm": 0.8200557657837084,
"acc_norm_stderr": 0.003833559228158668
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.039105257528497236,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.039105257528497236
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067884,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067884
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7096774193548387,
"acc_stderr": 0.02582210611941589,
"acc_norm": 0.7096774193548387,
"acc_norm_stderr": 0.02582210611941589
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153303,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153303
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110943,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110943
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.592436974789916,
"acc_stderr": 0.03191863374478466,
"acc_norm": 0.592436974789916,
"acc_norm_stderr": 0.03191863374478466
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7779816513761468,
"acc_stderr": 0.017818849564796634,
"acc_norm": 0.7779816513761468,
"acc_norm_stderr": 0.017818849564796634
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145624,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145624
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.04453197507374984,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.04453197507374984
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459754,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459754
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8020434227330779,
"acc_stderr": 0.01424887354921758,
"acc_norm": 0.8020434227330779,
"acc_norm_stderr": 0.01424887354921758
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879716,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27039106145251396,
"acc_stderr": 0.014854993938010073,
"acc_norm": 0.27039106145251396,
"acc_norm_stderr": 0.014854993938010073
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.026643278474508755,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.026643278474508755
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900922,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291463,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291463
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4380704041720991,
"acc_stderr": 0.012671902782567654,
"acc_norm": 0.4380704041720991,
"acc_norm_stderr": 0.012671902782567654
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681393,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681393
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.01965992249362335,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.01965992249362335
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31456548347613217,
"mc1_stderr": 0.016255241993179185,
"mc2": 0.4630793817398098,
"mc2_stderr": 0.014741207245405565
},
"harness|winogrande|5": {
"acc": 0.7758484609313339,
"acc_stderr": 0.011720400740774104
},
"harness|gsm8k|5": {
"acc": 0.312357846853677,
"acc_stderr": 0.012765850404191427
}
}
```
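As an illustration, per-task `acc` values from a results dictionary like the one above can be averaged with plain Python. The three-entry dictionary below is a hypothetical excerpt for illustration, not the full results file:

```python
# Average per-task accuracies from a harness results dictionary.
# The three entries below are a hypothetical excerpt, not the full file.
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.8589743589743589},
    "harness|hendrycksTest-virology|5": {"acc": 0.4819277108433735},
    "harness|gsm8k|5": {"acc": 0.312357846853677},
}

# Collect every task that reports an "acc" metric and take the mean.
accs = [metrics["acc"] for metrics in results.values() if "acc" in metrics]
mean_acc = sum(accs) / len(accs)
print(f"mean acc over {len(accs)} tasks: {mean_acc:.4f}")
```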
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
sk-uma/controlnet-hair | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 1766724305.824
num_examples: 4573
download_size: 1776442522
dataset_size: 1766724305.824
---
# Dataset Card for "controlnet-hair"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/lmind_hotpot_train1000_eval200_v1_reciteonly_qa | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
splits:
- name: train_qa
num_bytes: 173266
num_examples: 1000
- name: train_recite_qa
num_bytes: 1024784
num_examples: 1000
- name: eval_qa
num_bytes: 33160
num_examples: 200
- name: eval_recite_qa
num_bytes: 208740
num_examples: 200
- name: all_docs
num_bytes: 1054269
num_examples: 2373
- name: train
num_bytes: 1024784
num_examples: 1000
- name: validation
num_bytes: 208740
num_examples: 200
download_size: 2341351
dataset_size: 3727743
---
# Dataset Card for "lmind_hotpot_train1000_eval200_v1_reciteonly_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
taesiri/arxiv_audio_archived | ---
license: apache-2.0
---
|
soyasis/wikihow_small | ---
language: en
license: mit
---
# WikiHow Entries
Contains wikiHow questions, answers, and summaries in `.json` format.
louisbrulenaudet/code-tourisme | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code du tourisme
source_datasets:
- original
pretty_name: Code du tourisme
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code du tourisme, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging
import os

import datasets
from tqdm.notebook import tqdm
def dataset_loader(
name:str,
streaming:bool=True
) -> datasets.Dataset:
"""
Helper function to load a single dataset in parallel.
Parameters
----------
name : str
Name of the dataset to be loaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
dataset : datasets.Dataset
Loaded dataset object.
Raises
------
Exception
If an error occurs during dataset loading.
"""
try:
return datasets.load_dataset(
name,
split="train",
streaming=streaming
)
except Exception as exc:
logging.error(f"Error loading dataset {name}: {exc}")
return None
def load_datasets(
req:list,
streaming:bool=True
) -> list:
"""
Downloads datasets specified in a list and creates a list of loaded datasets.
Parameters
----------
req : list
A list containing the names of datasets to be downloaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
datasets_list : list
A list containing loaded datasets as per the requested names provided in 'req'.
Raises
------
Exception
If an error occurs during dataset loading or processing.
Examples
--------
>>> datasets = load_datasets(["dataset1", "dataset2"], streaming=False)
"""
datasets_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
future_to_dataset = {executor.submit(dataset_loader, name): name for name in req}
for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
name = future_to_dataset[future]
try:
dataset = future.result()
if dataset:
datasets_list.append(dataset)
except Exception as exc:
logging.error(f"Error processing dataset {name}: {exc}")
return datasets_list
req = [
"louisbrulenaudet/code-artisanat",
"louisbrulenaudet/code-action-sociale-familles",
# ...
]
datasets_list = load_datasets(
req=req,
streaming=True
)
dataset = datasets.concatenate_datasets(
datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries; each dictionary contains the following fields:
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
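A minimal sketch of how these fields might be assembled into a training prompt; the record below is fabricated for illustration, and the prompt template itself is an assumption rather than part of the dataset:

```python
# Assemble an instruction-tuning prompt from one record of the dataset.
# The record is fabricated; only the field names (instruction, input,
# output) come from the schema above, and the template is illustrative.
record = {
    "instruction": "Compose l'intégralité de l'article sous forme écrite.",
    "input": "Code du tourisme, art. L110-1",
    "output": "Texte intégral de l'article...",
}

# One common convention: instruction and input form the prompt,
# output is the expected completion.
prompt = f"{record['instruction']}\n\n{record['input']}"
completion = record["output"]
print(prompt)
```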
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
mozilla-foundation/common_voice_15_0 | ---
pretty_name: Common Voice Corpus 15
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language_bcp47:
- ab
- af
- am
- ar
- as
- ast
- az
- ba
- bas
- be
- bg
- bn
- br
- ca
- ckb
- cnh
- cs
- cv
- cy
- da
- de
- dv
- dyu
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- fy-NL
- ga-IE
- gl
- gn
- ha
- he
- hi
- hsb
- hu
- hy-AM
- ia
- id
- ig
- is
- it
- ja
- ka
- kab
- kk
- kmr
- ko
- ky
- lg
- lo
- lt
- lv
- mdf
- mhr
- mk
- ml
- mn
- mr
- mrj
- mt
- myv
- nan-tw
- ne-NP
- nl
- nn-NO
- oc
- or
- pa-IN
- pl
- ps
- pt
- quy
- rm-sursilv
- rm-vallader
- ro
- ru
- rw
- sah
- sat
- sc
- sk
- skr
- sl
- sq
- sr
- sv-SE
- sw
- ta
- th
- ti
- tig
- tk
- tok
- tr
- tt
- tw
- ug
- uk
- ur
- uz
- vi
- vot
- yo
- yue
- zgh
- zh-CN
- zh-HK
- zh-TW
license:
- cc0-1.0
multilinguality:
- multilingual
source_datasets:
- extended|common_voice
task_categories:
- automatic-speech-recognition
paperswithcode_id: common-voice
extra_gated_prompt: "By clicking on “Access repository” below, you also agree to not attempt to determine the identity of speakers in the Common Voice dataset."
---
# Dataset Card for Common Voice Corpus 15
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://commonvoice.mozilla.org/en/datasets
- **Repository:** https://github.com/common-voice/common-voice
- **Paper:** https://arxiv.org/abs/1912.06670
- **Leaderboard:** https://paperswithcode.com/dataset/common-voice
- **Point of Contact:** [Vaibhav Srivastav](mailto:vaibhav@huggingface.co)
### Dataset Summary
The Common Voice dataset consists of a unique MP3 and corresponding text file.
Many of the 28750 recorded hours in the dataset also include demographic metadata like age, sex, and accent
that can help improve the accuracy of speech recognition engines.
The dataset currently consists of 19159 validated hours in 114 languages, but more voices and languages are always added.
Take a look at the [Languages](https://commonvoice.mozilla.org/en/languages) page to request a language or start contributing.
### Supported Tasks and Leaderboards
The results for models trained on the Common Voice datasets are available via the
[🤗 Speech Bench](https://huggingface.co/spaces/huggingface/hf-speech-bench)
### Languages
```
Abkhaz, Afrikaans, Albanian, Amharic, Arabic, Armenian, Assamese, Asturian, Azerbaijani, Basaa, Bashkir, Basque, Belarusian, Bengali, Breton, Bulgarian, Cantonese, Catalan, Central Kurdish, Chinese (China), Chinese (Hong Kong), Chinese (Taiwan), Chuvash, Czech, Danish, Dhivehi, Dioula, Dutch, English, Erzya, Esperanto, Estonian, Finnish, French, Frisian, Galician, Georgian, German, Greek, Guarani, Hakha Chin, Hausa, Hebrew, Hill Mari, Hindi, Hungarian, Icelandic, Igbo, Indonesian, Interlingua, Irish, Italian, Japanese, Kabyle, Kazakh, Kinyarwanda, Korean, Kurmanji Kurdish, Kyrgyz, Lao, Latvian, Lithuanian, Luganda, Macedonian, Malayalam, Maltese, Marathi, Meadow Mari, Moksha, Mongolian, Nepali, Norwegian Nynorsk, Occitan, Odia, Pashto, Persian, Polish, Portuguese, Punjabi, Quechua Chanka, Romanian, Romansh Sursilvan, Romansh Vallader, Russian, Sakha, Santali (Ol Chiki), Saraiki, Sardinian, Serbian, Slovak, Slovenian, Upper Sorbian, Spanish, Swahili, Swedish, Taiwanese (Minnan), Tamazight, Tamil, Tatar, Thai, Tigre, Tigrinya, Toki Pona, Turkish, Turkmen, Twi, Ukrainian, Urdu, Uyghur, Uzbek, Vietnamese, Votic, Welsh, Yoruba
```
## How to use
The `datasets` library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the `load_dataset` function.
For example, to download the Hindi config, simply specify the corresponding language config name (i.e., "hi" for Hindi):
```python
from datasets import load_dataset
cv_15 = load_dataset("mozilla-foundation/common_voice_15_0", "hi", split="train")
```
Using the datasets library, you can also stream the dataset on-the-fly by adding a `streaming=True` argument to the `load_dataset` function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.
```python
from datasets import load_dataset
cv_15 = load_dataset("mozilla-foundation/common_voice_15_0", "hi", split="train", streaming=True)
print(next(iter(cv_15)))
```
*Bonus*: create a [PyTorch dataloader](https://huggingface.co/docs/datasets/use_with_pytorch) directly with your own datasets (local/streamed).
### Local
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
from torch.utils.data.sampler import BatchSampler, RandomSampler
cv_15 = load_dataset("mozilla-foundation/common_voice_15_0", "hi", split="train")
batch_sampler = BatchSampler(RandomSampler(cv_15), batch_size=32, drop_last=False)
dataloader = DataLoader(cv_15, batch_sampler=batch_sampler)
```
### Streaming
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
cv_15 = load_dataset("mozilla-foundation/common_voice_15_0", "hi", split="train", streaming=True)
dataloader = DataLoader(cv_15, batch_size=32)
```
To find out more about loading and preparing audio datasets, head over to [hf.co/blog/audio-datasets](https://huggingface.co/blog/audio-datasets).
### Example scripts
Train your own CTC or Seq2Seq Automatic Speech Recognition models on Common Voice 15 with `transformers` - [here](https://github.com/huggingface/transformers/tree/main/examples/pytorch/speech-recognition).
## Dataset Structure
### Data Instances
A typical data point comprises the `path` to the audio file and its `sentence`.
Additional fields include `accent`, `age`, `client_id`, `up_votes`, `down_votes`, `gender`, `locale` and `segment`.
```python
{
'client_id': 'd59478fbc1ee646a28a3c652a119379939123784d99131b865a89f8b21c81f69276c48bd574b81267d9d1a77b83b43e6d475a6cfc79c232ddbca946ae9c7afc5',
'path': 'et/clips/common_voice_et_18318995.mp3',
'audio': {
'path': 'et/clips/common_voice_et_18318995.mp3',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346, 0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 48000
},
'sentence': 'Tasub kokku saada inimestega, keda tunned juba ammust ajast saati.',
'up_votes': 2,
'down_votes': 0,
'age': 'twenties',
'gender': 'male',
'accent': '',
'locale': 'et',
'segment': ''
}
```
### Data Fields
`client_id` (`string`): An id for which client (voice) made the recording
`path` (`string`): The path to the audio file
`audio` (`dict`): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
`sentence` (`string`): The sentence the user was prompted to speak
`up_votes` (`int64`): How many upvotes the audio file has received from reviewers
`down_votes` (`int64`): How many downvotes the audio file has received from reviewers
`age` (`string`): The age of the speaker (e.g. `teens`, `twenties`, `fifties`)
`gender` (`string`): The gender of the speaker
`accent` (`string`): Accent of the speaker
`locale` (`string`): The locale of the speaker
`segment` (`string`): Usually an empty field
### Data Splits
The speech material has been subdivided into portions for dev, train, test, validated, invalidated, reported and other.
The validated data is data that has been validated with reviewers and received upvotes that the data is of high quality.
The invalidated data is data that has been invalidated by reviewers
and received downvotes indicating that the data is of low quality.
The reported data is data that has been reported, for different reasons.
The other data is data that has not yet been reviewed.
The dev, test and train splits contain data that has been reviewed and deemed of high quality.
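As a rough illustration of how review votes map clips to these buckets (the exact thresholds are internal to Common Voice and may differ; this sketch assumes two votes are enough to settle a clip):

```python
def clip_status(up_votes: int, down_votes: int) -> str:
    """Rough sketch of how review votes could map a clip to a bucket.

    The real Common Voice thresholds are internal to the project; this
    illustration assumes two votes are enough to settle a clip.
    """
    if up_votes + down_votes < 2:
        return "other"        # not yet fully reviewed
    if up_votes > down_votes:
        return "validated"    # reviewers agreed the clip is good
    return "invalidated"      # reviewers agreed the clip is bad

print(clip_status(up_votes=2, down_votes=0))  # → validated
```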
## Data Preprocessing Recommended by Hugging Face
The following are data preprocessing steps advised by the Hugging Face team. They are accompanied by an example code snippet that shows how to put them to practice.
Many examples in this dataset have trailing quotation marks, e.g _“the cat sat on the mat.“_. These trailing quotation marks do not change the actual meaning of the sentence, and it is near impossible to infer whether a sentence is a quotation or not a quotation from audio data alone. In these cases, it is advised to strip the quotation marks, leaving: _the cat sat on the mat_.
In addition, the majority of training sentences end in punctuation ( . or ? or ! ), whereas just a small proportion do not. In the dev set, **almost all** sentences end in punctuation. Thus, it is recommended to append a full-stop ( . ) to the end of the small number of training examples that do not end in punctuation.
```python
from datasets import load_dataset
ds = load_dataset("mozilla-foundation/common_voice_15_0", "en", use_auth_token=True)
def prepare_dataset(batch):
"""Function to preprocess the dataset with the .map method"""
transcription = batch["sentence"]
if transcription.startswith('"') and transcription.endswith('"'):
# we can remove trailing quotation marks as they do not affect the transcription
transcription = transcription[1:-1]
if transcription[-1] not in [".", "?", "!"]:
# append a full-stop to sentences that do not end in punctuation
transcription = transcription + "."
batch["sentence"] = transcription
return batch
ds = ds.map(prepare_dataset, desc="preprocess dataset")
```
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
## Considerations for Using the Data
### Social Impact of Dataset
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Public Domain, [CC-0](https://creativecommons.org/share-your-work/public-domain/cc0/)
### Citation Information
```
@inproceedings{commonvoice:2020,
author = {Ardila, R. and Branson, M. and Davis, K. and Henretty, M. and Kohler, M. and Meyer, J. and Morais, R. and Saunders, L. and Tyers, F. M. and Weber, G.},
title = {Common Voice: A Massively-Multilingual Speech Corpus},
booktitle = {Proceedings of the 12th Conference on Language Resources and Evaluation (LREC 2020)},
pages = {4211--4215},
year = 2020
}
```
|
cartesinus/iva_mt_wslot | ---
dataset_info:
features:
- name: id
dtype: string
- name: locale
dtype: string
- name: origin
dtype: string
- name: partition
dtype: string
- name: translation_utt
dtype:
translation:
languages:
- en
- pl
- name: translation_xml
dtype:
translation:
languages:
- en
- pl
- name: src_bio
dtype: string
- name: tgt_bio
dtype: string
splits:
- name: train
num_bytes: 6187206
num_examples: 20362
- name: validation
num_bytes: 1115480
num_examples: 3681
- name: test
num_bytes: 1587613
num_examples: 5394
download_size: 3851892
dataset_size: 8890299
task_categories:
- translation
language:
- en
- pl
- de
- es
- sv
- fr
- pt
tags:
- machine translation
- nlu
- natural-language-understanding
- virtual assistant
pretty_name: Machine translation for NLU with slot transfer
size_categories:
- 10K<n<100K
license: cc-by-4.0
---
# Machine translation dataset for NLU (Virtual Assistant) with slot transfer between languages
version: 0.5.1
## Dataset Summary
Disclaimer: This is for research purposes only. Please have a look at the license section below. Some of the datasets used to construct IVA_MT have an unknown license.
IVA_MT is a machine translation dataset that can be used to train, adapt, and evaluate MT models used in a Virtual Assistant NLU context (e.g., to translate the training corpus of an NLU system).
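The `src_bio` and `tgt_bio` fields carry slot annotations as BIO tag sequences. A minimal sketch of recovering slot spans from such tags (the tokens, tags, and slot name below are made up for illustration):

```python
def bio_to_slots(tokens, tags):
    """Collect (slot_name, text) spans from parallel token/BIO-tag lists."""
    slots, current_name, current_tokens = [], None, []
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A new slot starts; flush any span that was in progress.
            if current_name is not None:
                slots.append((current_name, " ".join(current_tokens)))
            current_name, current_tokens = tag[2:], [token]
        elif tag.startswith("I-") and current_name is not None:
            current_tokens.append(token)  # continue the current span
        else:
            # "O" tag (or stray "I-"): close any open span.
            if current_name is not None:
                slots.append((current_name, " ".join(current_tokens)))
            current_name, current_tokens = None, []
    if current_name is not None:
        slots.append((current_name, " ".join(current_tokens)))
    return slots

# Made-up example utterance; the slot name "time" is illustrative only.
tokens = ["set", "an", "alarm", "for", "seven", "am"]
tags = ["O", "O", "O", "O", "B-time", "I-time"]
print(bio_to_slots(tokens, tags))  # → [('time', 'seven am')]
```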
## Dataset Composition
### en-pl
| Corpus | Train | Dev | Test |
|----------------------------------------------------------------------|--------|-------|-------|
| [Massive 1.1](https://huggingface.co/datasets/AmazonScience/massive) | 11514 | 2033 | 2974 |
| [Leyzer 0.2.0](https://github.com/cartesinus/leyzer/tree/0.2.0) | 3974 | 701 | 1380 |
| [OpenSubtitles from OPUS](https://opus.nlpl.eu/OpenSubtitles-v1.php) | 2329 | 411 | 500 |
| [KDE from OPUS](https://opus.nlpl.eu/KDE4.php) | 1154 | 241 | 241 |
| [CCMatrix from Opus](https://opus.nlpl.eu/CCMatrix.php) | 1096 | 232 | 237 |
| [Ubuntu from OPUS](https://opus.nlpl.eu/Ubuntu.php) | 281 | 60 | 59 |
| [Gnome from OPUS](https://opus.nlpl.eu/GNOME.php) | 14 | 3 | 3 |
| *total* | 20362 | 3681 | 5394 |
### en-de
| Corpus | Train | Dev | Test |
|----------------------------------------------------------------------|--------|-------|-------|
| [Massive 1.1](https://huggingface.co/datasets/AmazonScience/massive) | 7536 | 1346 | 1955 |
### en-es
| Corpus | Train | Dev | Test |
|----------------------------------------------------------------------|--------|-------|-------|
| [Massive 1.1](https://huggingface.co/datasets/AmazonScience/massive) | 8415 | 1526 | 2202 |
### en-sv
| Corpus | Train | Dev | Test |
|----------------------------------------------------------------------|--------|-------|-------|
| [Massive 1.1](https://huggingface.co/datasets/AmazonScience/massive) | 7540 | 1360 | 1921 |
### en-fr
| Corpus | Train | Dev | Test |
|----------------------------------------------------------------------|--------|-------|-------|
| [Massive 1.1](https://huggingface.co/datasets/AmazonScience/massive) | 6800 | 1203 | 1757 |
### en-pt
| Corpus | Train | Dev | Test |
|----------------------------------------------------------------------|--------|-------|-------|
| [Massive 1.1](https://huggingface.co/datasets/AmazonScience/massive) | 7368 | 1296 | 1885 |
### en-hi
| Corpus | Train | Dev | Test |
|----------------------------------------------------------------------|--------|-------|-------|
| [Massive 1.1](https://huggingface.co/datasets/AmazonScience/massive) | 6702 | 1175 | 1747 |
### en-tr
| Corpus | Train | Dev | Test |
|----------------------------------------------------------------------|--------|-------|-------|
| [Massive 1.1](https://huggingface.co/datasets/AmazonScience/massive) | 8269 | 1474 | 2170 |
### en-ja
| Corpus | Train | Dev | Test |
|----------------------------------------------------------------------|--------|-------|-------|
| [Massive 1.1](https://huggingface.co/datasets/AmazonScience/massive) | 8066 | 1434 | 2085 |
### en-zh
| Corpus | Train | Dev | Test |
|----------------------------------------------------------------------|--------|-------|-------|
| [Massive 1.1](https://huggingface.co/datasets/AmazonScience/massive) | 8433 | 1513 | 2179 |
| ChatGPT | 1312 | 200 | 200 |
## Tools
Scripts used to generate this dataset can be found on [github](https://github.com/cartesinus/iva_mt).
## Citation
If you use this dataset, please cite:
```
@article{Sowanski2023SlotLI,
title={Slot Lost in Translation? Not Anymore: A Machine Translation Model for Virtual Assistants with Type-Independent Slot Transfer},
author={Marcin Sowanski and Artur Janicki},
journal={2023 30th International Conference on Systems, Signals and Image Processing (IWSSIP)},
year={2023},
pages={1-5}
}
```
## License
This is a composition of 7 datasets; each license is as defined in the original release:
- MASSIVE: [CC-BY 4.0](https://huggingface.co/datasets/AmazonScience/massive/blob/main/LICENSE)
- Leyzer: [CC BY-NC 4.0](https://github.com/cartesinus/leyzer/blob/master/LICENSE)
- OpenSubtitles: unknown
- KDE: [GNU General Public License](https://l10n.kde.org/about.php)
- CCMatrix: no license given, therefore assuming it is LASER project license [BSD](https://github.com/facebookresearch/LASER/blob/main/LICENSE)
- Ubuntu: [GNU General Public License](https://help.launchpad.net/Legal)
- Gnome: unknown
|
quocanh34/viet_vivos | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 1538496458.734306
num_examples: 9964
- name: test
num_bytes: 80709780.0
num_examples: 686
- name: validation
num_bytes: 107697815.0
num_examples: 685
download_size: 1697050577
dataset_size: 1726904053.734306
---
# Dataset Card for "viet_vivos"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
argilla/go_emotions_raw | ---
size_categories: 10K<n<100K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for go_emotions_raw
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
It contains the raw version of [go_emotions](https://huggingface.co/datasets/go_emotions) as a `FeedbackDataset`. Each of the original examples is defined as a single
`FeedbackRecord` and contains the `responses` from each annotator. The final labels in the *simplified* version of the dataset have been used as `suggestions`, so that this
dataset can showcase the metrics related to inter-annotator agreement as well as the `responses` vs `suggestions` metrics.
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("argilla/go_emotions_raw")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("argilla/go_emotions_raw")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| text | Text | FieldTypes.text | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| label | Label | QuestionTypes.multi_label_selection | True | Classify the text by selecting the correct label from the given list of labels. | ['admiration', 'amusement', 'anger', 'annoyance', 'approval', 'caring', 'confusion', 'curiosity', 'desire', 'disappointment', 'disapproval', 'disgust', 'embarrassment', 'excitement', 'fear', 'gratitude', 'grief', 'joy', 'love', 'nervousness', 'optimism', 'pride', 'realization', 'relief', 'remorse', 'sadness', 'surprise', 'neutral'] |
The **suggestions** are human- or machine-generated recommendations for each question, intended to assist the annotator during the annotation process. They are always linked to an existing question and are named by appending "-suggestion" and "-suggestion-metadata" to the question name, holding the suggested value(s) and their metadata, respectively. The possible values are therefore the same as in the table above, under column names with those suffixes.
The **metadata** is a dictionary that can be used to provide additional information about the dataset record, such as a link to its original source, the author, the date, or the provenance. This can give annotators useful context. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
The **guidelines** are optional as well; they are a plain string that can be used to provide instructions to the annotators. Find them in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"text": " \"If you don\u0027t wear BROWN AND ORANGE...YOU DON\u0027T MATTER!\" We need a tshirt with that on it asap! "
},
"metadata": {},
"responses": [
{
"status": "submitted",
"user_id": "00000000-0000-0000-0000-000000000001",
"values": {
"label": {
"value": [
"neutral"
]
}
}
},
{
"status": "submitted",
"user_id": "00000000-0000-0000-0000-000000000016",
"values": {
"label": {
"value": [
"anger",
"annoyance",
"optimism"
]
}
}
},
{
"status": "submitted",
"user_id": "00000000-0000-0000-0000-000000000028",
"values": {
"label": {
"value": [
"approval"
]
}
}
},
{
"status": "submitted",
"user_id": "00000000-0000-0000-0000-000000000039",
"values": {
"label": {
"value": [
"neutral"
]
}
}
},
{
"status": "submitted",
"user_id": "00000000-0000-0000-0000-000000000048",
"values": {
"label": {
"value": [
"annoyance"
]
}
}
}
],
"suggestions": [
{
"agent": null,
"question_name": "label",
"score": null,
"type": "human",
"value": [
"annoyance",
"neutral"
]
}
],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": null,
"label": [
{
"status": "submitted",
"user_id": "00000000-0000-0000-0000-000000000001",
"value": [
"neutral"
]
},
{
"status": "submitted",
"user_id": "00000000-0000-0000-0000-000000000016",
"value": [
"anger",
"annoyance",
"optimism"
]
},
{
"status": "submitted",
"user_id": "00000000-0000-0000-0000-000000000028",
"value": [
"approval"
]
},
{
"status": "submitted",
"user_id": "00000000-0000-0000-0000-000000000039",
"value": [
"neutral"
]
},
{
"status": "submitted",
"user_id": "00000000-0000-0000-0000-000000000048",
"value": [
"annoyance"
]
}
],
"label-suggestion": [
"annoyance",
"neutral"
],
"label-suggestion-metadata": {
"agent": null,
"score": null,
"type": "human"
},
"metadata": "{}",
"text": " \"If you don\u0027t wear BROWN AND ORANGE...YOU DON\u0027T MATTER!\" We need a tshirt with that on it asap! "
}
```
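Given a record in the `datasets` format above, per-annotator votes can be aggregated into label counts with the standard library alone. The record below is abridged from the example (only the fields needed for counting are kept):

```python
from collections import Counter

# Abridged record in the HuggingFace `datasets` layout shown above: each entry
# under "label" is one annotator's submitted response.
record = {
    "label": [
        {"status": "submitted", "value": ["neutral"]},
        {"status": "submitted", "value": ["anger", "annoyance", "optimism"]},
        {"status": "submitted", "value": ["approval"]},
        {"status": "submitted", "value": ["neutral"]},
        {"status": "submitted", "value": ["annoyance"]},
    ],
}

# Count one vote per (annotator, label) pair, skipping non-submitted responses.
votes = Counter(
    label
    for response in record["label"]
    if response["status"] == "submitted"
    for label in response["value"]
)
print(votes.most_common(2))  # [('neutral', 2), ('annoyance', 2)]
```

Note how the two most-voted labels match the `label-suggestion` of the example, which holds the final labels from the simplified go_emotions release.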
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
* **text** is of type `FieldTypes.text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **label** is of type `QuestionTypes.multi_label_selection` with the following allowed values ['admiration', 'amusement', 'anger', 'annoyance', 'approval', 'caring', 'confusion', 'curiosity', 'desire', 'disappointment', 'disapproval', 'disgust', 'embarrassment', 'excitement', 'fear', 'gratitude', 'grief', 'joy', 'love', 'nervousness', 'optimism', 'pride', 'realization', 'relief', 'remorse', 'sadness', 'surprise', 'neutral'], and description "Classify the text by selecting the correct label from the given list of labels.".
* **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **label-suggestion** is of type `QuestionTypes.multi_label_selection` with the following allowed values ['admiration', 'amusement', 'anger', 'annoyance', 'approval', 'caring', 'confusion', 'curiosity', 'desire', 'disappointment', 'disapproval', 'disgust', 'embarrassment', 'excitement', 'fear', 'gratitude', 'grief', 'joy', 'love', 'nervousness', 'optimism', 'pride', 'realization', 'relief', 'remorse', 'sadness', 'surprise', 'neutral'].
Additionally, we also have two more fields that are optional and are the following:
* **metadata:** This is an optional field that can be used to provide additional information about the dataset record, such as a link to its original source, the author, the date, or the provenance. It can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Script used for the generation
```python
import argilla as rg
from datasets import load_dataset
import uuid
from datasets import concatenate_datasets
ds = load_dataset("go_emotions", "raw", split="train")
ds_prepared = load_dataset("go_emotions")
_CLASS_NAMES = [
"admiration",
"amusement",
"anger",
"annoyance",
"approval",
"caring",
"confusion",
"curiosity",
"desire",
"disappointment",
"disapproval",
"disgust",
"embarrassment",
"excitement",
"fear",
"gratitude",
"grief",
"joy",
"love",
"nervousness",
"optimism",
"pride",
"realization",
"relief",
"remorse",
"sadness",
"surprise",
"neutral",
]
label_to_id = {label: i for i, label in enumerate(_CLASS_NAMES)}
id_to_label = {i: label for i, label in enumerate(_CLASS_NAMES)}
# Concatenate the datasets and transform to pd.DataFrame
ds_prepared = concatenate_datasets([ds_prepared["train"], ds_prepared["validation"], ds_prepared["test"]])
df_prepared = ds_prepared.to_pandas()
# Obtain the final labels as a dict, to later include these as suggestions
labels_prepared = {}
for idx in df_prepared.index:
labels = [id_to_label[label_id] for label_id in df_prepared['labels'][idx]]
labels_prepared[df_prepared['id'][idx]] = labels
# Add labels to the dataset and keep only the relevant columns
def add_labels(ex):
labels = []
for label in _CLASS_NAMES:
if ex[label] == 1:
labels.append(label)
ex["labels"] = labels
return ex
ds = ds.map(add_labels)
df = ds.select_columns(["text", "labels", "rater_id", "id"]).to_pandas()
# Create a FeedbackDataset for text classification
feedback_dataset = rg.FeedbackDataset.for_text_classification(labels=_CLASS_NAMES, multi_label=True)
# Create the records with the original responses, and use as suggestions
# the final labels in the "simplified" go_emotions dataset.
records = []
for text, df_text in df.groupby("text"):
responses = []
for rater_id, df_raters in df_text.groupby("rater_id"):
responses.append(
{
"values": {"label": {"value": df_raters["labels"].iloc[0].tolist()}},
"status": "submitted",
"user_id": uuid.UUID(int=rater_id),
}
)
    # All rows for the same text share one example id, so take it from df_text
    # rather than from the inner-loop variable left over after the rater loop.
    suggested_labels = labels_prepared.get(df_text["id"].iloc[0], None)
if not suggested_labels:
continue
suggestion = [
{
"question_name": "label",
"value": suggested_labels,
"type": "human",
}
]
records.append(
rg.FeedbackRecord(
            fields={"text": text},
responses=responses,
suggestions=suggestion
)
)
feedback_dataset.add_records(records)
# Push to the hub
feedback_dataset.push_to_huggingface("plaguss/go_emotions_raw")
```
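One example of the `responses` vs `suggestions` metrics this dataset enables is the mean overlap between each annotator's label set and the suggested labels. The sketch below uses Jaccard overlap as the agreement measure (an illustrative choice, not necessarily the metric Argilla computes), with the label sets taken from the example record shown earlier in this card:

```python
def jaccard(a, b):
    """Jaccard overlap between two label sets (1.0 for two empty sets)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

# Suggested labels and per-annotator responses from the example record above.
suggestion = ["annoyance", "neutral"]
responses = [
    ["neutral"],
    ["anger", "annoyance", "optimism"],
    ["approval"],
    ["neutral"],
    ["annoyance"],
]

# Mean agreement of annotator responses with the suggestion.
mean_agreement = sum(jaccard(r, suggestion) for r in responses) / len(responses)
print(round(mean_agreement, 2))  # 0.35
```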
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
This is a text classification dataset that contains texts and labels. Given a set of texts and a predefined set of labels, the goal of text classification is to assign one or more labels to each text based on its content. Please classify the texts by making the correct selection.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |