Columns: `datasetId` (string, 2–117 chars), `card` (string, 19 chars–1.01M)
evanarlian/common_voice_11_0_id_filtered
---
dataset_info:
  features:
  - name: client_id
    dtype: string
  - name: path
    dtype: string
  - name: audio
    dtype:
      audio:
        sampling_rate: 16000
  - name: sentence
    dtype: string
  - name: up_votes
    dtype: int64
  - name: down_votes
    dtype: int64
  - name: age
    dtype: string
  - name: gender
    dtype: string
  - name: accent
    dtype: string
  - name: locale
    dtype: string
  - name: segment
    dtype: string
  splits:
  - name: train
    num_bytes: 570693903.7812607
    num_examples: 22906
  - name: validation
    num_bytes: 98832914.0
    num_examples: 3226
  - name: test
    num_bytes: 112254685.0
    num_examples: 3618
  - name: other
    num_bytes: 147132536.35696015
    num_examples: 6380
  - name: invalidated
    num_bytes: 63830420.0
    num_examples: 2466
  download_size: 975354578
  dataset_size: 992744459.1382209
---
# Dataset Card for "common_voice_11_0_id_filtered"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
kgr123/quality_mcqa_2
---
dataset_info:
  features:
  - name: document_id
    dtype: string
  - name: url
    dtype: string
  - name: title
    dtype: string
  - name: text
    dtype: string
  - name: context_orig
    dtype: string
  - name: token_soft_limit_deberta
    dtype: int64
  - name: len_soft_limit
    dtype: int64
  - name: context
    dtype: string
  - name: questions
    dtype: string
  - name: insertion_labels
    dtype: string
  - name: query
    dtype: string
  - name: option_0
    dtype: string
  - name: option_1
    dtype: string
  - name: option_2
    dtype: string
  - name: option_3
    dtype: string
  - name: label
    dtype: int64
  splits:
  - name: train
    num_bytes: 245016586.66436782
    num_examples: 1732
  - name: validation
    num_bytes: 52760184.83513513
    num_examples: 367
  - name: test
    num_bytes: 52689800.18648649
    num_examples: 367
  download_size: 145262229
  dataset_size: 350466571.68598944
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: validation
    path: data/validation-*
  - split: test
    path: data/test-*
---
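The schema above describes a four-option multiple-choice QA dataset (`query`, `option_0`..`option_3`, integer `label`). A minimal sketch of turning one row into an evaluation prompt — the field names come from the card's schema, but the prompt template and letter mapping are assumptions, not part of the dataset:

```python
# Hypothetical sketch: format one row of this MCQA dataset into a prompt.
# Field names (query, option_0..option_3, label) come from the schema above;
# the prompt template itself is an assumption, not part of the dataset.
def format_mcqa(row: dict) -> tuple[str, str]:
    """Return (prompt, answer_letter) for a single example."""
    letters = "ABCD"
    options = "\n".join(f"{letters[i]}. {row[f'option_{i}']}" for i in range(4))
    prompt = f"{row['query']}\n{options}\nAnswer:"
    return prompt, letters[row["label"]]
```

Rows in that shape could come from, e.g., `datasets.load_dataset("kgr123/quality_mcqa_2", split="train")`.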
phanvancongthanh/data_standardized
---
dataset_info:
  features:
  - name: smiles
    dtype: float64
  splits:
  - name: train
    num_bytes: 0
    num_examples: 0
  download_size: 548
  dataset_size: 0
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
# Dataset Card for "pubchem_enamine_standardized"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
liuyanchen1015/MULTI_VALUE_stsb_chaining_main_verbs
---
dataset_info:
  features:
  - name: sentence1
    dtype: string
  - name: sentence2
    dtype: string
  - name: score
    dtype: float64
  - name: idx
    dtype: int64
  - name: value_score
    dtype: int64
  splits:
  - name: dev
    num_bytes: 3468
    num_examples: 16
  - name: test
    num_bytes: 1671
    num_examples: 11
  - name: train
    num_bytes: 4670
    num_examples: 30
  download_size: 15921
  dataset_size: 9809
---
# Dataset Card for "MULTI_VALUE_stsb_chaining_main_verbs"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
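The card's split metadata is internally consistent, which can be checked directly — per-split `num_bytes` should sum to `dataset_size`. A quick sanity check, with all values copied from the YAML above:

```python
# Sanity check on the figures in the card above: per-split num_bytes should
# sum to dataset_size, and num_examples gives the total row count.
split_bytes = {"dev": 3468, "test": 1671, "train": 4670}
split_examples = {"dev": 16, "test": 11, "train": 30}

assert sum(split_bytes.values()) == 9809  # matches dataset_size
total_rows = sum(split_examples.values())  # 57 rows across all splits
```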
WaltonFuture/InstructionGPT-4
---
task_categories:
- visual-question-answering
size_categories:
- n<1K
---
# InstructionGPT-4: A 200-Instruction Paradigm for Fine-Tuning MiniGPT-4

[Lai Wei](https://waltonfuture.github.io/), Zihao Jiang, [Weiran Huang](https://www.weiranhuang.com/), [Lichao Sun](https://lichao-sun.github.io/).

**Shanghai Jiao Tong University, Lehigh University**

[Paper](https://arxiv.org/abs/2308.12067), [Link](https://mp.weixin.qq.com/s/s4Acec71v5oMlFkyhlCL_g), [Code](https://github.com/waltonfuture/InstructionGPT-4)

## Introduction

Multimodal large language models acquire their instruction-following capabilities through a two-stage training process: pre-training on image-text pairs and fine-tuning on supervised vision-language instruction data. Recent studies have shown that large language models can achieve satisfactory results even with a limited amount of high-quality instruction-following data. In this paper, we introduce InstructionGPT-4, which is fine-tuned on a small dataset comprising only 200 examples, roughly 6% of the instruction-following data used in the alignment dataset for MiniGPT-4. We first propose several metrics to assess the quality of multimodal instruction data. Based on these metrics, we present a simple and effective data selector that automatically identifies and filters low-quality vision-language data. Using this method, InstructionGPT-4 outperforms the original MiniGPT-4 on various evaluations (e.g., visual question answering, GPT-4 preference). Overall, our findings demonstrate that a smaller amount of high-quality instruction-tuning data is sufficient to enable multimodal large language models to generate better output.

## Usage

You can download our vision-language dataset, containing only 200 high-quality examples, and replace the original cc_sbu_align dataset used in the fine-tuning stage of MiniGPT-4. The training settings are the same as [MiniGPT-4](https://github.com/Vision-CAIR/MiniGPT-4).
If you're using InstructionGPT-4 in your research or applications, please cite it using this BibTeX:

```bibtex
@article{wei2023instructiongpt,
  title={InstructionGPT-4: A 200-Instruction Paradigm for Fine-Tuning MiniGPT-4},
  author={Wei, Lai and Jiang, Zihao and Huang, Weiran and Sun, Lichao},
  journal={arXiv preprint arXiv:2308.12067},
  year={2023}
}
```
zolak/twitter_dataset_80_1713135290
---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: tweet_content
    dtype: string
  - name: user_name
    dtype: string
  - name: user_id
    dtype: string
  - name: created_at
    dtype: string
  - name: url
    dtype: string
  - name: favourite_count
    dtype: int64
  - name: scraped_at
    dtype: string
  - name: image_urls
    dtype: string
  splits:
  - name: train
    num_bytes: 292521
    num_examples: 699
  download_size: 140026
  dataset_size: 292521
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
okaris/ucberkeley-dlab-measuring-hate-speech
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: comment_id dtype: int32 - name: annotator_id dtype: int32 - name: platform dtype: int8 - name: sentiment dtype: float64 - name: respect dtype: float64 - name: insult dtype: float64 - name: humiliate dtype: float64 - name: status dtype: float64 - name: dehumanize dtype: float64 - name: violence dtype: float64 - name: genocide dtype: float64 - name: attack_defend dtype: float64 - name: hatespeech dtype: float64 - name: hate_speech_score dtype: float64 - name: text dtype: string - name: infitms dtype: float64 - name: outfitms dtype: float64 - name: annotator_severity dtype: float64 - name: std_err dtype: float64 - name: annotator_infitms dtype: float64 - name: annotator_outfitms dtype: float64 - name: hypothesis dtype: float64 - name: target_race_asian dtype: bool - name: target_race_black dtype: bool - name: target_race_latinx dtype: bool - name: target_race_middle_eastern dtype: bool - name: target_race_native_american dtype: bool - name: target_race_pacific_islander dtype: bool - name: target_race_white dtype: bool - name: target_race_other dtype: bool - name: target_race dtype: bool - name: target_religion_atheist dtype: bool - name: target_religion_buddhist dtype: bool - name: target_religion_christian dtype: bool - name: target_religion_hindu dtype: bool - name: target_religion_jewish dtype: bool - name: target_religion_mormon dtype: bool - name: target_religion_muslim dtype: bool - name: target_religion_other dtype: bool - name: target_religion dtype: bool - name: target_origin_immigrant dtype: bool - name: target_origin_migrant_worker dtype: bool - name: target_origin_specific_country dtype: bool - name: target_origin_undocumented dtype: bool - name: target_origin_other dtype: bool - name: target_origin dtype: bool - name: target_gender_men dtype: bool - name: target_gender_non_binary dtype: bool - name: target_gender_transgender_men dtype: bool - 
name: target_gender_transgender_unspecified dtype: bool - name: target_gender_transgender_women dtype: bool - name: target_gender_women dtype: bool - name: target_gender_other dtype: bool - name: target_gender dtype: bool - name: target_sexuality_bisexual dtype: bool - name: target_sexuality_gay dtype: bool - name: target_sexuality_lesbian dtype: bool - name: target_sexuality_straight dtype: bool - name: target_sexuality_other dtype: bool - name: target_sexuality dtype: bool - name: target_age_children dtype: bool - name: target_age_teenagers dtype: bool - name: target_age_young_adults dtype: bool - name: target_age_middle_aged dtype: bool - name: target_age_seniors dtype: bool - name: target_age_other dtype: bool - name: target_age dtype: bool - name: target_disability_physical dtype: bool - name: target_disability_cognitive dtype: bool - name: target_disability_neurological dtype: bool - name: target_disability_visually_impaired dtype: bool - name: target_disability_hearing_impaired dtype: bool - name: target_disability_unspecific dtype: bool - name: target_disability_other dtype: bool - name: target_disability dtype: bool - name: annotator_gender dtype: string - name: annotator_trans dtype: string - name: annotator_educ dtype: string - name: annotator_income dtype: string - name: annotator_ideology dtype: string - name: annotator_gender_men dtype: bool - name: annotator_gender_women dtype: bool - name: annotator_gender_non_binary dtype: bool - name: annotator_gender_prefer_not_to_say dtype: bool - name: annotator_gender_self_describe dtype: bool - name: annotator_transgender dtype: bool - name: annotator_cisgender dtype: bool - name: annotator_transgender_prefer_not_to_say dtype: bool - name: annotator_education_some_high_school dtype: bool - name: annotator_education_high_school_grad dtype: bool - name: annotator_education_some_college dtype: bool - name: annotator_education_college_grad_aa dtype: bool - name: annotator_education_college_grad_ba dtype: bool - 
name: annotator_education_professional_degree dtype: bool - name: annotator_education_masters dtype: bool - name: annotator_education_phd dtype: bool - name: annotator_income_<10k dtype: bool - name: annotator_income_10k-50k dtype: bool - name: annotator_income_50k-100k dtype: bool - name: annotator_income_100k-200k dtype: bool - name: annotator_income_>200k dtype: bool - name: annotator_ideology_extremeley_conservative dtype: bool - name: annotator_ideology_conservative dtype: bool - name: annotator_ideology_slightly_conservative dtype: bool - name: annotator_ideology_neutral dtype: bool - name: annotator_ideology_slightly_liberal dtype: bool - name: annotator_ideology_liberal dtype: bool - name: annotator_ideology_extremeley_liberal dtype: bool - name: annotator_ideology_no_opinion dtype: bool - name: annotator_race_asian dtype: bool - name: annotator_race_black dtype: bool - name: annotator_race_latinx dtype: bool - name: annotator_race_middle_eastern dtype: bool - name: annotator_race_native_american dtype: bool - name: annotator_race_pacific_islander dtype: bool - name: annotator_race_white dtype: bool - name: annotator_race_other dtype: bool - name: annotator_age dtype: float64 - name: annotator_religion_atheist dtype: bool - name: annotator_religion_buddhist dtype: bool - name: annotator_religion_christian dtype: bool - name: annotator_religion_hindu dtype: bool - name: annotator_religion_jewish dtype: bool - name: annotator_religion_mormon dtype: bool - name: annotator_religion_muslim dtype: bool - name: annotator_religion_nothing dtype: bool - name: annotator_religion_other dtype: bool - name: annotator_sexuality_bisexual dtype: bool - name: annotator_sexuality_gay dtype: bool - name: annotator_sexuality_straight dtype: bool - name: annotator_sexuality_other dtype: bool splits: - name: train num_bytes: 52943809 num_examples: 135556 download_size: 19680581 dataset_size: 52943809 --- # Dataset Card for "ucberkeley-dlab-measuring-hate-speech" [More 
Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
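The card's central column is the continuous `hate_speech_score` (higher means more hateful). A hedged sketch of bucketing it into coarse labels — the cutoffs (above roughly 0.5 for hate speech, below roughly -1 for counter or supportive speech) follow the interpretation suggested by the original dataset authors, and should be treated as assumptions rather than something stated in this mirror's card:

```python
# Hedged sketch: bucket the continuous hate_speech_score into coarse labels.
# The cutoffs (> 0.5 hateful, < -1 counter/supportive) are the original
# authors' suggested reading of the score, assumed here, not part of this card.
def bucket_hate_score(score: float) -> str:
    if score > 0.5:
        return "hate_speech"
    if score < -1.0:
        return "counter_or_supportive"
    return "neutral_or_ambiguous"
```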
yajingchen/MarketMail-AI-Dataset-test
---
dataset_info:
  features:
  - name: product
    dtype: string
  - name: description
    dtype: string
  - name: marketing_email
    dtype: string
  splits:
  - name: train
    num_bytes: 5722
    num_examples: 5
  download_size: 11317
  dataset_size: 5722
---
# Dataset Card for "MarketMail-AI-Dataset-test"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
curateIT/themet_openaccess_isPublicDomain
--- license: cc0-1.0 ---
carloswylker/BatistaLima
--- license: openrail ---
open-llm-leaderboard/details_deepseek-ai__deepseek-coder-1.3b-instruct
--- pretty_name: Evaluation run of deepseek-ai/deepseek-coder-1.3b-instruct dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [deepseek-ai/deepseek-coder-1.3b-instruct](https://huggingface.co/deepseek-ai/deepseek-coder-1.3b-instruct)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_deepseek-ai__deepseek-coder-1.3b-instruct\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-12-04T15:02:34.832979](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-coder-1.3b-instruct/blob/main/results_2023-12-04T15-02-34.832979.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.28454514557458804,\n\ \ \"acc_stderr\": 0.031975322223722284,\n \"acc_norm\": 0.2866992057773824,\n\ \ \"acc_norm_stderr\": 0.03278367939837994,\n \"mc1\": 0.2594859241126071,\n\ \ \"mc1_stderr\": 0.015345409485557989,\n \"mc2\": 0.44015243507625756,\n\ \ \"mc2_stderr\": 0.015219908561861553\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.257679180887372,\n \"acc_stderr\": 0.0127807705627684,\n\ \ \"acc_norm\": 0.2858361774744027,\n \"acc_norm_stderr\": 0.013203196088537369\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.33419637522405893,\n\ \ \"acc_stderr\": 0.004707447244200623,\n \"acc_norm\": 0.398725353515236,\n\ \ \"acc_norm_stderr\": 0.0048863535635718545\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n\ \ \"acc_stderr\": 0.03944624162501117,\n \"acc_norm\": 0.2962962962962963,\n\ \ \"acc_norm_stderr\": 0.03944624162501117\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\ \ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\ \ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \ \ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.3018867924528302,\n \"acc_stderr\": 0.02825420034443865,\n\ \ \"acc_norm\": 0.3018867924528302,\n \"acc_norm_stderr\": 0.02825420034443865\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\ \ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\ \ \"acc_norm_stderr\": 
0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n\ \ \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \ \ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\ \ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\ \ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\ \ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\ \ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.03025123757921317,\n\ \ \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.03025123757921317\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\ \ \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n\ \ \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.38620689655172413,\n \"acc_stderr\": 0.04057324734419035,\n\ \ \"acc_norm\": 0.38620689655172413,\n \"acc_norm_stderr\": 0.04057324734419035\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.25925925925925924,\n \"acc_stderr\": 0.0225698970749184,\n \"\ acc_norm\": 
0.25925925925925924,\n \"acc_norm_stderr\": 0.0225698970749184\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\ \ \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n\ \ \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\ \ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2806451612903226,\n\ \ \"acc_stderr\": 0.025560604721022884,\n \"acc_norm\": 0.2806451612903226,\n\ \ \"acc_norm_stderr\": 0.025560604721022884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.031447125816782426,\n\ \ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.031447125816782426\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\ : 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.03546563019624336,\n\ \ \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.03546563019624336\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521,\n \"\ acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.03416903640391521\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.32642487046632124,\n \"acc_stderr\": 0.033840286211432945,\n\ \ \"acc_norm\": 0.32642487046632124,\n \"acc_norm_stderr\": 0.033840286211432945\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.2692307692307692,\n \"acc_stderr\": 0.022489389793654824,\n\ \ \"acc_norm\": 0.2692307692307692,\n \"acc_norm_stderr\": 0.022489389793654824\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652148,\n \ \ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652148\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341933,\n\ \ \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341933\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\ : 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\ \ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3174311926605505,\n\ \ \"acc_stderr\": 0.0199571521984605,\n \"acc_norm\": 0.3174311926605505,\n\ \ \"acc_norm_stderr\": 0.0199571521984605\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\ : {\n \"acc\": 0.25462962962962965,\n \"acc_stderr\": 0.02971127586000536,\n\ \ \"acc_norm\": 0.25462962962962965,\n \"acc_norm_stderr\": 0.02971127586000536\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604257,\n \"\ acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604257\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.3080168776371308,\n \"acc_stderr\": 0.0300523893356057,\n \ \ \"acc_norm\": 0.3080168776371308,\n \"acc_norm_stderr\": 0.0300523893356057\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.14798206278026907,\n\ \ \"acc_stderr\": 0.023831557157613537,\n \"acc_norm\": 0.14798206278026907,\n\ \ \"acc_norm_stderr\": 0.023831557157613537\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.3511450381679389,\n \"acc_stderr\": 0.04186445163013751,\n\ \ \"acc_norm\": 0.3511450381679389,\n \"acc_norm_stderr\": 0.04186445163013751\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.2892561983471074,\n \"acc_stderr\": 0.04139112727635464,\n \"\ acc_norm\": 0.2892561983471074,\n \"acc_norm_stderr\": 0.04139112727635464\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\ \ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \ \ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n\ \ \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\ \ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\ \ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258973,\n\ \ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258973\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3333333333333333,\n\ \ \"acc_stderr\": 0.03088273697413866,\n \"acc_norm\": 0.3333333333333333,\n\ \ \"acc_norm_stderr\": 0.03088273697413866\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.29246487867177523,\n\ \ \"acc_stderr\": 0.01626700068459864,\n \"acc_norm\": 0.29246487867177523,\n\ \ \"acc_norm_stderr\": 0.01626700068459864\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.28034682080924855,\n \"acc_stderr\": 0.024182427496577605,\n\ \ \"acc_norm\": 0.28034682080924855,\n \"acc_norm_stderr\": 0.024182427496577605\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\ \ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 
0.27262569832402234,\n\ \ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.026336613469046637,\n\ \ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.026336613469046637\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n\ \ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.29260450160771706,\n\ \ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.02517104191530968,\n\ \ \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.02517104191530968\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.24468085106382978,\n \"acc_stderr\": 0.025645553622266722,\n \ \ \"acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.025645553622266722\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2900912646675359,\n\ \ \"acc_stderr\": 0.011590375554733095,\n \"acc_norm\": 0.2900912646675359,\n\ \ \"acc_norm_stderr\": 0.011590375554733095\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.41544117647058826,\n \"acc_stderr\": 0.029935342707877746,\n\ \ \"acc_norm\": 0.41544117647058826,\n \"acc_norm_stderr\": 0.029935342707877746\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.24183006535947713,\n \"acc_stderr\": 0.017322789207784326,\n \ \ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.017322789207784326\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n\ \ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.33636363636363636,\n\ \ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.2530612244897959,\n \"acc_stderr\": 0.027833023871399694,\n\ \ \"acc_norm\": 0.2530612244897959,\n \"acc_norm_stderr\": 0.027833023871399694\n\ 
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3383084577114428,\n\ \ \"acc_stderr\": 0.03345563070339193,\n \"acc_norm\": 0.3383084577114428,\n\ \ \"acc_norm_stderr\": 0.03345563070339193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n\ \ \"acc_stderr\": 0.03384429155233137,\n \"acc_norm\": 0.25301204819277107,\n\ \ \"acc_norm_stderr\": 0.03384429155233137\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.21637426900584794,\n \"acc_stderr\": 0.03158149539338734,\n\ \ \"acc_norm\": 0.21637426900584794,\n \"acc_norm_stderr\": 0.03158149539338734\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2594859241126071,\n\ \ \"mc1_stderr\": 0.015345409485557989,\n \"mc2\": 0.44015243507625756,\n\ \ \"mc2_stderr\": 0.015219908561861553\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.5240726124704025,\n \"acc_stderr\": 0.01403618966539513\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01061410159211524,\n \ \ \"acc_stderr\": 0.002822713322387704\n }\n}\n```" repo_url: https://huggingface.co/deepseek-ai/deepseek-coder-1.3b-instruct leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|arc:challenge|25_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-12-04T15-02-34.832979.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|gsm8k|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hellaswag_10 data_files: - split: 
2023_12_04T15_02_34.832979 path: - '**/details_harness|hellaswag|10_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-02-34.832979.parquet' - 
'**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-02-34.832979.parquet' - 
'**/details_harness|hendrycksTest-management|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-02-34.832979.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-02-34.832979.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-02-34.832979.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-04T15-02-34.832979.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-02-34.832979.parquet' - config_name: 
harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 
2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - 
'**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-02-34.832979.parquet' - config_name: 
harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 
data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-management|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-02-34.832979.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - 
split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-virology|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-02-34.832979.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|truthfulqa:mc|0_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-12-04T15-02-34.832979.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_12_04T15_02_34.832979 path: - '**/details_harness|winogrande|5_2023-12-04T15-02-34.832979.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-12-04T15-02-34.832979.parquet' - config_name: results data_files: - split: 
2023_12_04T15_02_34.832979 path: - results_2023-12-04T15-02-34.832979.parquet - split: latest path: - results_2023-12-04T15-02-34.832979.parquet ---

# Dataset Card for Evaluation run of deepseek-ai/deepseek-coder-1.3b-instruct

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/deepseek-ai/deepseek-coder-1.3b-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [deepseek-ai/deepseek-coder-1.3b-instruct](https://huggingface.co/deepseek-ai/deepseek-coder-1.3b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_deepseek-ai__deepseek-coder-1.3b-instruct",
	"harness_winogrande_5",
	split="latest")
```

## Latest results

These are the [latest results from run 2023-12-04T15:02:34.832979](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-coder-1.3b-instruct/blob/main/results_2023-12-04T15-02-34.832979.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.28454514557458804, "acc_stderr": 0.031975322223722284, "acc_norm": 0.2866992057773824, "acc_norm_stderr": 0.03278367939837994, "mc1": 0.2594859241126071, "mc1_stderr": 0.015345409485557989, "mc2": 0.44015243507625756, "mc2_stderr": 0.015219908561861553 }, "harness|arc:challenge|25": { "acc": 0.257679180887372, "acc_stderr": 0.0127807705627684, "acc_norm": 0.2858361774744027, "acc_norm_stderr": 0.013203196088537369 }, "harness|hellaswag|10": { "acc": 0.33419637522405893, "acc_stderr": 0.004707447244200623, "acc_norm": 0.398725353515236, "acc_norm_stderr": 0.0048863535635718545 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2962962962962963, "acc_stderr": 0.03944624162501117, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.03944624162501117 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.19736842105263158, "acc_stderr": 0.03238981601699397, "acc_norm": 0.19736842105263158, "acc_norm_stderr": 0.03238981601699397 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.3018867924528302, "acc_stderr": 0.02825420034443865, "acc_norm": 0.3018867924528302, "acc_norm_stderr": 0.02825420034443865 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.29, "acc_stderr": 0.04560480215720683, "acc_norm": 0.29, 
"acc_norm_stderr": 0.04560480215720683 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.23, "acc_stderr": 0.042295258468165044, "acc_norm": 0.23, "acc_norm_stderr": 0.042295258468165044 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.24277456647398843, "acc_stderr": 0.0326926380614177, "acc_norm": 0.24277456647398843, "acc_norm_stderr": 0.0326926380614177 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.2549019607843137, "acc_stderr": 0.043364327079931785, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.043364327079931785 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.31063829787234043, "acc_stderr": 0.03025123757921317, "acc_norm": 0.31063829787234043, "acc_norm_stderr": 0.03025123757921317 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.24561403508771928, "acc_stderr": 0.040493392977481425, "acc_norm": 0.24561403508771928, "acc_norm_stderr": 0.040493392977481425 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.38620689655172413, "acc_stderr": 0.04057324734419035, "acc_norm": 0.38620689655172413, "acc_norm_stderr": 0.04057324734419035 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.25925925925925924, "acc_stderr": 0.0225698970749184, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.0225698970749184 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.21428571428571427, "acc_stderr": 0.03670066451047181, "acc_norm": 0.21428571428571427, "acc_norm_stderr": 0.03670066451047181 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.2806451612903226, "acc_stderr": 0.025560604721022884, "acc_norm": 0.2806451612903226, "acc_norm_stderr": 0.025560604721022884 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.27586206896551724, "acc_stderr": 0.031447125816782426, "acc_norm": 0.27586206896551724, "acc_norm_stderr": 0.031447125816782426 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.2909090909090909, "acc_stderr": 0.03546563019624336, "acc_norm": 0.2909090909090909, "acc_norm_stderr": 0.03546563019624336 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.35858585858585856, "acc_stderr": 0.03416903640391521, "acc_norm": 0.35858585858585856, "acc_norm_stderr": 0.03416903640391521 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.32642487046632124, "acc_stderr": 0.033840286211432945, "acc_norm": 0.32642487046632124, "acc_norm_stderr": 0.033840286211432945 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2692307692307692, "acc_stderr": 0.022489389793654824, "acc_norm": 0.2692307692307692, "acc_norm_stderr": 0.022489389793654824 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24444444444444444, "acc_stderr": 0.026202766534652148, "acc_norm": 0.24444444444444444, "acc_norm_stderr": 0.026202766534652148 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2605042016806723, "acc_stderr": 0.028510251512341933, "acc_norm": 0.2605042016806723, "acc_norm_stderr": 0.028510251512341933 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.304635761589404, "acc_stderr": 0.03757949922943343, "acc_norm": 0.304635761589404, "acc_norm_stderr": 0.03757949922943343 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.3174311926605505, "acc_stderr": 0.0199571521984605, "acc_norm": 0.3174311926605505, "acc_norm_stderr": 0.0199571521984605 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.25462962962962965, "acc_stderr": 
0.02971127586000536, "acc_norm": 0.25462962962962965, "acc_norm_stderr": 0.02971127586000536 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.2549019607843137, "acc_stderr": 0.030587591351604257, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.030587591351604257 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.3080168776371308, "acc_stderr": 0.0300523893356057, "acc_norm": 0.3080168776371308, "acc_norm_stderr": 0.0300523893356057 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.14798206278026907, "acc_stderr": 0.023831557157613537, "acc_norm": 0.14798206278026907, "acc_norm_stderr": 0.023831557157613537 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.3511450381679389, "acc_stderr": 0.04186445163013751, "acc_norm": 0.3511450381679389, "acc_norm_stderr": 0.04186445163013751 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2892561983471074, "acc_stderr": 0.04139112727635464, "acc_norm": 0.2892561983471074, "acc_norm_stderr": 0.04139112727635464 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25, "acc_stderr": 0.04186091791394607, "acc_norm": 0.25, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.2883435582822086, "acc_stderr": 0.035590395316173425, "acc_norm": 0.2883435582822086, "acc_norm_stderr": 0.035590395316173425 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.32142857142857145, "acc_stderr": 0.04432804055291519, "acc_norm": 0.32142857142857145, "acc_norm_stderr": 0.04432804055291519 }, "harness|hendrycksTest-management|5": { "acc": 0.3786407766990291, "acc_stderr": 0.04802694698258973, "acc_norm": 0.3786407766990291, "acc_norm_stderr": 0.04802694698258973 }, "harness|hendrycksTest-marketing|5": { "acc": 0.3333333333333333, "acc_stderr": 0.03088273697413866, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.03088273697413866 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 
0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.29246487867177523, "acc_stderr": 0.01626700068459864, "acc_norm": 0.29246487867177523, "acc_norm_stderr": 0.01626700068459864 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.28034682080924855, "acc_stderr": 0.024182427496577605, "acc_norm": 0.28034682080924855, "acc_norm_stderr": 0.024182427496577605 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.27262569832402234, "acc_stderr": 0.014893391735249588, "acc_norm": 0.27262569832402234, "acc_norm_stderr": 0.014893391735249588 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.30392156862745096, "acc_stderr": 0.026336613469046637, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.026336613469046637 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.29260450160771706, "acc_stderr": 0.025839898334877983, "acc_norm": 0.29260450160771706, "acc_norm_stderr": 0.025839898334877983 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.28703703703703703, "acc_stderr": 0.02517104191530968, "acc_norm": 0.28703703703703703, "acc_norm_stderr": 0.02517104191530968 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.24468085106382978, "acc_stderr": 0.025645553622266722, "acc_norm": 0.24468085106382978, "acc_norm_stderr": 0.025645553622266722 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2900912646675359, "acc_stderr": 0.011590375554733095, "acc_norm": 0.2900912646675359, "acc_norm_stderr": 0.011590375554733095 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.41544117647058826, "acc_stderr": 0.029935342707877746, "acc_norm": 0.41544117647058826, "acc_norm_stderr": 0.029935342707877746 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.24183006535947713, "acc_stderr": 0.017322789207784326, "acc_norm": 0.24183006535947713, "acc_norm_stderr": 0.017322789207784326 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.33636363636363636, "acc_stderr": 
0.04525393596302506, "acc_norm": 0.33636363636363636, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.2530612244897959, "acc_stderr": 0.027833023871399694, "acc_norm": 0.2530612244897959, "acc_norm_stderr": 0.027833023871399694 }, "harness|hendrycksTest-sociology|5": { "acc": 0.3383084577114428, "acc_stderr": 0.03345563070339193, "acc_norm": 0.3383084577114428, "acc_norm_stderr": 0.03345563070339193 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-virology|5": { "acc": 0.25301204819277107, "acc_stderr": 0.03384429155233137, "acc_norm": 0.25301204819277107, "acc_norm_stderr": 0.03384429155233137 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.21637426900584794, "acc_stderr": 0.03158149539338734, "acc_norm": 0.21637426900584794, "acc_norm_stderr": 0.03158149539338734 }, "harness|truthfulqa:mc|0": { "mc1": 0.2594859241126071, "mc1_stderr": 0.015345409485557989, "mc2": 0.44015243507625756, "mc2_stderr": 0.015219908561861553 }, "harness|winogrande|5": { "acc": 0.5240726124704025, "acc_stderr": 0.01403618966539513 }, "harness|gsm8k|5": { "acc": 0.01061410159211524, "acc_stderr": 0.002822713322387704 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
totally-not-an-llm/EverythingLM-data
--- license: mit --- # EverythingLM Dataset **EverythingLM** is a diverse instruct dataset consisting of ~1k sets of system prompts, instructions, and corresponding responses. These sets were generated using principles from both evol-instruct and Orca. The dataset encompasses a wide array of topics and interactions. ### Categories: - Reasoning - Creative Writing - General Knowledge - Brainstorming - Search Query - Coding - Basic Instruct We also leverage various system prompts for evol-instruct and for responding to prompts. This dataset has also been filtered to remove OpenAI alignment. ### How it stands out: - Long, detailed outputs - Humanlike creativity - CoT reasoning - Complex & challenging tasks ### Plans: - Train Llama 7b & 13b models - Train Llama 70b QLoRA - Generate V2 of the dataset, with more categories and GPT-4 ### How does it work? 1. Generate list of categories, prompts, sysprompts, etc (human) 2. Generate seed prompts (GPT) 3. Evolve prompts (GPT) 4. Generate responses (GPT) 5. Convert to Alpaca dataset format Included in this repo is the script to generate the dataset. However, it is buggy and probably not the best implementation possible.
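The final conversion step (5) can be sketched roughly as follows — a minimal, hypothetical example of mapping one generated set (system prompt, instruction, response) into an Alpaca-style record. The input field names here are assumptions for illustration, not taken from the actual generation script:

```python
# Hypothetical sketch of step 5: converting a generated set into
# the Alpaca record format (instruction / input / output).
# Input-side field names are assumed, not from the real script.
def to_alpaca(sample: dict) -> dict:
    return {
        "instruction": sample["instruction"],
        "input": sample.get("system_prompt", ""),
        "output": sample["response"],
    }

sets = [
    {
        "system_prompt": "You are a careful step-by-step reasoner.",
        "instruction": "Explain why the sky appears blue.",
        "response": "Sunlight scatters off air molecules; shorter "
                    "blue wavelengths scatter more strongly.",
    }
]
alpaca_records = [to_alpaca(s) for s in sets]
```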
juliensimon/autotrain-data-chest-xray-demo
--- task_categories: - image-classification --- # AutoTrain Dataset for project: chest-xray-demo ## Dataset Description This dataset has been automatically processed by AutoTrain for project chest-xray-demo. The original dataset is located at https://www.kaggle.com/datasets/paultimothymooney/chest-xray-pneumonia ## Dataset Structure ``` ├── train │   ├── NORMAL │   └── PNEUMONIA └── valid ├── NORMAL └── PNEUMONIA ``` ### Data Instances A sample from this dataset looks as follows: ```json [ { "image": "<2090x1858 L PIL image>", "target": 0 }, { "image": "<1422x1152 L PIL image>", "target": 0 } ] ``` ### Dataset Fields The dataset has the following fields (also called "features"): ```json { "image": "Image(decode=True, id=None)", "target": "ClassLabel(num_classes=2, names=['NORMAL', 'PNEUMONIA'], id=None)" } ``` ### Dataset Splits This dataset is split into a train and validation split. The split sizes are as follows: | Split name | Num samples | | ------------ | ------------------- | | train | 5216 | | valid | 624 |
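As a rough illustration of how the folder layout above maps to `ClassLabel` targets, here is a minimal, self-contained sketch. It builds a throwaway directory tree matching the structure shown (rather than touching the real Kaggle data), and the file names are placeholders:

```python
import os
import tempfile

# Class names mirror the card's ClassLabel: 0 -> NORMAL, 1 -> PNEUMONIA.
CLASS_NAMES = ["NORMAL", "PNEUMONIA"]

def index_split(split_dir):
    """Collect (file_path, target) pairs from one split directory."""
    pairs = []
    for target, name in enumerate(CLASS_NAMES):
        class_dir = os.path.join(split_dir, name)
        for fname in sorted(os.listdir(class_dir)):
            pairs.append((os.path.join(class_dir, fname), target))
    return pairs

# Build a throwaway tree matching the layout shown above.
root = tempfile.mkdtemp()
for split in ("train", "valid"):
    for name in CLASS_NAMES:
        class_dir = os.path.join(root, split, name)
        os.makedirs(class_dir)
        open(os.path.join(class_dir, "sample.jpeg"), "w").close()

train_pairs = index_split(os.path.join(root, "train"))
```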
Ar4ikov/sd_filtered_2m
--- dataset_info: features: - name: Prompt dtype: string splits: - name: train num_bytes: 427667829.2266251 num_examples: 2672923 - name: test num_bytes: 47018271.06645638 num_examples: 296922 download_size: 364684829 dataset_size: 474686100.29308146 --- # Dataset Card for "sd_filtered_2m" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
HuggingFaceM4/IIIT-5K-classif-Sample
BangumiBase/gotoubunnohanayome
--- license: mit tags: - art size_categories: - 10K<n<100K --- # Bangumi Image Base of Gotoubun No Hanayome This is the image base of bangumi Gotoubun no Hanayome; we detected 134 characters and 16632 images in total. The full dataset is [here](all.zip). **Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy.** If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability). Here is the characters' preview: | # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 | |:------|---------:|:----------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------| | 0 | 2366 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) | | 1 | 432 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) | | 2 | 144 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) | | 3 | 182 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | 
![preview 2](3/preview_2.png) | ![preview 3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) | | 4 | 221 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) | | 5 | 73 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) | | 6 | 35 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) | | 7 | 42 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) | | 8 | 61 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | ![preview 7](8/preview_7.png) | ![preview 8](8/preview_8.png) | | 9 | 17 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) | | 10 
| 15 | [Download](10/dataset.zip) | ![preview 1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) | | 11 | 61 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) | | 12 | 22 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) | | 13 | 17 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) | | 14 | 88 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) | | 15 | 41 | [Download](15/dataset.zip) | ![preview 1](15/preview_1.png) | ![preview 2](15/preview_2.png) | ![preview 3](15/preview_3.png) | ![preview 4](15/preview_4.png) | ![preview 5](15/preview_5.png) | ![preview 6](15/preview_6.png) | ![preview 7](15/preview_7.png) | ![preview 8](15/preview_8.png) | | 16 | 50 | [Download](16/dataset.zip) | ![preview 1](16/preview_1.png) | ![preview 2](16/preview_2.png) | ![preview 3](16/preview_3.png) | ![preview 4](16/preview_4.png) | 
![preview 5](16/preview_5.png) | ![preview 6](16/preview_6.png) | ![preview 7](16/preview_7.png) | ![preview 8](16/preview_8.png) | | 17 | 51 | [Download](17/dataset.zip) | ![preview 1](17/preview_1.png) | ![preview 2](17/preview_2.png) | ![preview 3](17/preview_3.png) | ![preview 4](17/preview_4.png) | ![preview 5](17/preview_5.png) | ![preview 6](17/preview_6.png) | ![preview 7](17/preview_7.png) | ![preview 8](17/preview_8.png) | | 18 | 63 | [Download](18/dataset.zip) | ![preview 1](18/preview_1.png) | ![preview 2](18/preview_2.png) | ![preview 3](18/preview_3.png) | ![preview 4](18/preview_4.png) | ![preview 5](18/preview_5.png) | ![preview 6](18/preview_6.png) | ![preview 7](18/preview_7.png) | ![preview 8](18/preview_8.png) | | 19 | 30 | [Download](19/dataset.zip) | ![preview 1](19/preview_1.png) | ![preview 2](19/preview_2.png) | ![preview 3](19/preview_3.png) | ![preview 4](19/preview_4.png) | ![preview 5](19/preview_5.png) | ![preview 6](19/preview_6.png) | ![preview 7](19/preview_7.png) | ![preview 8](19/preview_8.png) | | 20 | 49 | [Download](20/dataset.zip) | ![preview 1](20/preview_1.png) | ![preview 2](20/preview_2.png) | ![preview 3](20/preview_3.png) | ![preview 4](20/preview_4.png) | ![preview 5](20/preview_5.png) | ![preview 6](20/preview_6.png) | ![preview 7](20/preview_7.png) | ![preview 8](20/preview_8.png) | | 21 | 59 | [Download](21/dataset.zip) | ![preview 1](21/preview_1.png) | ![preview 2](21/preview_2.png) | ![preview 3](21/preview_3.png) | ![preview 4](21/preview_4.png) | ![preview 5](21/preview_5.png) | ![preview 6](21/preview_6.png) | ![preview 7](21/preview_7.png) | ![preview 8](21/preview_8.png) | | 22 | 8 | [Download](22/dataset.zip) | ![preview 1](22/preview_1.png) | ![preview 2](22/preview_2.png) | ![preview 3](22/preview_3.png) | ![preview 4](22/preview_4.png) | ![preview 5](22/preview_5.png) | ![preview 6](22/preview_6.png) | ![preview 7](22/preview_7.png) | ![preview 8](22/preview_8.png) | | 23 | 18 | [Download](23/dataset.zip) 
| ![preview 1](23/preview_1.png) | ![preview 2](23/preview_2.png) | ![preview 3](23/preview_3.png) | ![preview 4](23/preview_4.png) | ![preview 5](23/preview_5.png) | ![preview 6](23/preview_6.png) | ![preview 7](23/preview_7.png) | ![preview 8](23/preview_8.png) | | 24 | 13 | [Download](24/dataset.zip) | ![preview 1](24/preview_1.png) | ![preview 2](24/preview_2.png) | ![preview 3](24/preview_3.png) | ![preview 4](24/preview_4.png) | ![preview 5](24/preview_5.png) | ![preview 6](24/preview_6.png) | ![preview 7](24/preview_7.png) | ![preview 8](24/preview_8.png) | | 25 | 33 | [Download](25/dataset.zip) | ![preview 1](25/preview_1.png) | ![preview 2](25/preview_2.png) | ![preview 3](25/preview_3.png) | ![preview 4](25/preview_4.png) | ![preview 5](25/preview_5.png) | ![preview 6](25/preview_6.png) | ![preview 7](25/preview_7.png) | ![preview 8](25/preview_8.png) | | 26 | 30 | [Download](26/dataset.zip) | ![preview 1](26/preview_1.png) | ![preview 2](26/preview_2.png) | ![preview 3](26/preview_3.png) | ![preview 4](26/preview_4.png) | ![preview 5](26/preview_5.png) | ![preview 6](26/preview_6.png) | ![preview 7](26/preview_7.png) | ![preview 8](26/preview_8.png) | | 27 | 28 | [Download](27/dataset.zip) | ![preview 1](27/preview_1.png) | ![preview 2](27/preview_2.png) | ![preview 3](27/preview_3.png) | ![preview 4](27/preview_4.png) | ![preview 5](27/preview_5.png) | ![preview 6](27/preview_6.png) | ![preview 7](27/preview_7.png) | ![preview 8](27/preview_8.png) | | 28 | 26 | [Download](28/dataset.zip) | ![preview 1](28/preview_1.png) | ![preview 2](28/preview_2.png) | ![preview 3](28/preview_3.png) | ![preview 4](28/preview_4.png) | ![preview 5](28/preview_5.png) | ![preview 6](28/preview_6.png) | ![preview 7](28/preview_7.png) | ![preview 8](28/preview_8.png) | | 29 | 140 | [Download](29/dataset.zip) | ![preview 1](29/preview_1.png) | ![preview 2](29/preview_2.png) | ![preview 3](29/preview_3.png) | ![preview 4](29/preview_4.png) | ![preview 5](29/preview_5.png) | 
![preview 6](29/preview_6.png) | ![preview 7](29/preview_7.png) | ![preview 8](29/preview_8.png) | | 30 | 148 | [Download](30/dataset.zip) | ![preview 1](30/preview_1.png) | ![preview 2](30/preview_2.png) | ![preview 3](30/preview_3.png) | ![preview 4](30/preview_4.png) | ![preview 5](30/preview_5.png) | ![preview 6](30/preview_6.png) | ![preview 7](30/preview_7.png) | ![preview 8](30/preview_8.png) | | 31 | 22 | [Download](31/dataset.zip) | ![preview 1](31/preview_1.png) | ![preview 2](31/preview_2.png) | ![preview 3](31/preview_3.png) | ![preview 4](31/preview_4.png) | ![preview 5](31/preview_5.png) | ![preview 6](31/preview_6.png) | ![preview 7](31/preview_7.png) | ![preview 8](31/preview_8.png) | | 32 | 29 | [Download](32/dataset.zip) | ![preview 1](32/preview_1.png) | ![preview 2](32/preview_2.png) | ![preview 3](32/preview_3.png) | ![preview 4](32/preview_4.png) | ![preview 5](32/preview_5.png) | ![preview 6](32/preview_6.png) | ![preview 7](32/preview_7.png) | ![preview 8](32/preview_8.png) | | 33 | 27 | [Download](33/dataset.zip) | ![preview 1](33/preview_1.png) | ![preview 2](33/preview_2.png) | ![preview 3](33/preview_3.png) | ![preview 4](33/preview_4.png) | ![preview 5](33/preview_5.png) | ![preview 6](33/preview_6.png) | ![preview 7](33/preview_7.png) | ![preview 8](33/preview_8.png) | | 34 | 20 | [Download](34/dataset.zip) | ![preview 1](34/preview_1.png) | ![preview 2](34/preview_2.png) | ![preview 3](34/preview_3.png) | ![preview 4](34/preview_4.png) | ![preview 5](34/preview_5.png) | ![preview 6](34/preview_6.png) | ![preview 7](34/preview_7.png) | ![preview 8](34/preview_8.png) | | 35 | 30 | [Download](35/dataset.zip) | ![preview 1](35/preview_1.png) | ![preview 2](35/preview_2.png) | ![preview 3](35/preview_3.png) | ![preview 4](35/preview_4.png) | ![preview 5](35/preview_5.png) | ![preview 6](35/preview_6.png) | ![preview 7](35/preview_7.png) | ![preview 8](35/preview_8.png) | | 36 | 50 | [Download](36/dataset.zip) | ![preview 
1](36/preview_1.png) | ![preview 2](36/preview_2.png) | ![preview 3](36/preview_3.png) | ![preview 4](36/preview_4.png) | ![preview 5](36/preview_5.png) | ![preview 6](36/preview_6.png) | ![preview 7](36/preview_7.png) | ![preview 8](36/preview_8.png) | | 37 | 1583 | [Download](37/dataset.zip) | ![preview 1](37/preview_1.png) | ![preview 2](37/preview_2.png) | ![preview 3](37/preview_3.png) | ![preview 4](37/preview_4.png) | ![preview 5](37/preview_5.png) | ![preview 6](37/preview_6.png) | ![preview 7](37/preview_7.png) | ![preview 8](37/preview_8.png) | | 38 | 113 | [Download](38/dataset.zip) | ![preview 1](38/preview_1.png) | ![preview 2](38/preview_2.png) | ![preview 3](38/preview_3.png) | ![preview 4](38/preview_4.png) | ![preview 5](38/preview_5.png) | ![preview 6](38/preview_6.png) | ![preview 7](38/preview_7.png) | ![preview 8](38/preview_8.png) | | 39 | 41 | [Download](39/dataset.zip) | ![preview 1](39/preview_1.png) | ![preview 2](39/preview_2.png) | ![preview 3](39/preview_3.png) | ![preview 4](39/preview_4.png) | ![preview 5](39/preview_5.png) | ![preview 6](39/preview_6.png) | ![preview 7](39/preview_7.png) | ![preview 8](39/preview_8.png) | | 40 | 8 | [Download](40/dataset.zip) | ![preview 1](40/preview_1.png) | ![preview 2](40/preview_2.png) | ![preview 3](40/preview_3.png) | ![preview 4](40/preview_4.png) | ![preview 5](40/preview_5.png) | ![preview 6](40/preview_6.png) | ![preview 7](40/preview_7.png) | ![preview 8](40/preview_8.png) | | 41 | 1898 | [Download](41/dataset.zip) | ![preview 1](41/preview_1.png) | ![preview 2](41/preview_2.png) | ![preview 3](41/preview_3.png) | ![preview 4](41/preview_4.png) | ![preview 5](41/preview_5.png) | ![preview 6](41/preview_6.png) | ![preview 7](41/preview_7.png) | ![preview 8](41/preview_8.png) | | 42 | 1866 | [Download](42/dataset.zip) | ![preview 1](42/preview_1.png) | ![preview 2](42/preview_2.png) | ![preview 3](42/preview_3.png) | ![preview 4](42/preview_4.png) | ![preview 5](42/preview_5.png) | 
![preview 6](42/preview_6.png) | ![preview 7](42/preview_7.png) | ![preview 8](42/preview_8.png) | | 43 | 32 | [Download](43/dataset.zip) | ![preview 1](43/preview_1.png) | ![preview 2](43/preview_2.png) | ![preview 3](43/preview_3.png) | ![preview 4](43/preview_4.png) | ![preview 5](43/preview_5.png) | ![preview 6](43/preview_6.png) | ![preview 7](43/preview_7.png) | ![preview 8](43/preview_8.png) | | 44 | 8 | [Download](44/dataset.zip) | ![preview 1](44/preview_1.png) | ![preview 2](44/preview_2.png) | ![preview 3](44/preview_3.png) | ![preview 4](44/preview_4.png) | ![preview 5](44/preview_5.png) | ![preview 6](44/preview_6.png) | ![preview 7](44/preview_7.png) | ![preview 8](44/preview_8.png) | | 45 | 60 | [Download](45/dataset.zip) | ![preview 1](45/preview_1.png) | ![preview 2](45/preview_2.png) | ![preview 3](45/preview_3.png) | ![preview 4](45/preview_4.png) | ![preview 5](45/preview_5.png) | ![preview 6](45/preview_6.png) | ![preview 7](45/preview_7.png) | ![preview 8](45/preview_8.png) | | 46 | 17 | [Download](46/dataset.zip) | ![preview 1](46/preview_1.png) | ![preview 2](46/preview_2.png) | ![preview 3](46/preview_3.png) | ![preview 4](46/preview_4.png) | ![preview 5](46/preview_5.png) | ![preview 6](46/preview_6.png) | ![preview 7](46/preview_7.png) | ![preview 8](46/preview_8.png) | | 47 | 1752 | [Download](47/dataset.zip) | ![preview 1](47/preview_1.png) | ![preview 2](47/preview_2.png) | ![preview 3](47/preview_3.png) | ![preview 4](47/preview_4.png) | ![preview 5](47/preview_5.png) | ![preview 6](47/preview_6.png) | ![preview 7](47/preview_7.png) | ![preview 8](47/preview_8.png) | | 48 | 41 | [Download](48/dataset.zip) | ![preview 1](48/preview_1.png) | ![preview 2](48/preview_2.png) | ![preview 3](48/preview_3.png) | ![preview 4](48/preview_4.png) | ![preview 5](48/preview_5.png) | ![preview 6](48/preview_6.png) | ![preview 7](48/preview_7.png) | ![preview 8](48/preview_8.png) | | 49 | 61 | [Download](49/dataset.zip) | ![preview 
1](49/preview_1.png) | ![preview 2](49/preview_2.png) | ![preview 3](49/preview_3.png) | ![preview 4](49/preview_4.png) | ![preview 5](49/preview_5.png) | ![preview 6](49/preview_6.png) | ![preview 7](49/preview_7.png) | ![preview 8](49/preview_8.png) | | 50 | 29 | [Download](50/dataset.zip) | ![preview 1](50/preview_1.png) | ![preview 2](50/preview_2.png) | ![preview 3](50/preview_3.png) | ![preview 4](50/preview_4.png) | ![preview 5](50/preview_5.png) | ![preview 6](50/preview_6.png) | ![preview 7](50/preview_7.png) | ![preview 8](50/preview_8.png) | | 51 | 150 | [Download](51/dataset.zip) | ![preview 1](51/preview_1.png) | ![preview 2](51/preview_2.png) | ![preview 3](51/preview_3.png) | ![preview 4](51/preview_4.png) | ![preview 5](51/preview_5.png) | ![preview 6](51/preview_6.png) | ![preview 7](51/preview_7.png) | ![preview 8](51/preview_8.png) | | 52 | 20 | [Download](52/dataset.zip) | ![preview 1](52/preview_1.png) | ![preview 2](52/preview_2.png) | ![preview 3](52/preview_3.png) | ![preview 4](52/preview_4.png) | ![preview 5](52/preview_5.png) | ![preview 6](52/preview_6.png) | ![preview 7](52/preview_7.png) | ![preview 8](52/preview_8.png) | | 53 | 282 | [Download](53/dataset.zip) | ![preview 1](53/preview_1.png) | ![preview 2](53/preview_2.png) | ![preview 3](53/preview_3.png) | ![preview 4](53/preview_4.png) | ![preview 5](53/preview_5.png) | ![preview 6](53/preview_6.png) | ![preview 7](53/preview_7.png) | ![preview 8](53/preview_8.png) | | 54 | 19 | [Download](54/dataset.zip) | ![preview 1](54/preview_1.png) | ![preview 2](54/preview_2.png) | ![preview 3](54/preview_3.png) | ![preview 4](54/preview_4.png) | ![preview 5](54/preview_5.png) | ![preview 6](54/preview_6.png) | ![preview 7](54/preview_7.png) | ![preview 8](54/preview_8.png) | | 55 | 1843 | [Download](55/dataset.zip) | ![preview 1](55/preview_1.png) | ![preview 2](55/preview_2.png) | ![preview 3](55/preview_3.png) | ![preview 4](55/preview_4.png) | ![preview 5](55/preview_5.png) | ![preview 
6](55/preview_6.png) | ![preview 7](55/preview_7.png) | ![preview 8](55/preview_8.png) | | 56 | 52 | [Download](56/dataset.zip) | ![preview 1](56/preview_1.png) | ![preview 2](56/preview_2.png) | ![preview 3](56/preview_3.png) | ![preview 4](56/preview_4.png) | ![preview 5](56/preview_5.png) | ![preview 6](56/preview_6.png) | ![preview 7](56/preview_7.png) | ![preview 8](56/preview_8.png) | | 57 | 22 | [Download](57/dataset.zip) | ![preview 1](57/preview_1.png) | ![preview 2](57/preview_2.png) | ![preview 3](57/preview_3.png) | ![preview 4](57/preview_4.png) | ![preview 5](57/preview_5.png) | ![preview 6](57/preview_6.png) | ![preview 7](57/preview_7.png) | ![preview 8](57/preview_8.png) | | 58 | 15 | [Download](58/dataset.zip) | ![preview 1](58/preview_1.png) | ![preview 2](58/preview_2.png) | ![preview 3](58/preview_3.png) | ![preview 4](58/preview_4.png) | ![preview 5](58/preview_5.png) | ![preview 6](58/preview_6.png) | ![preview 7](58/preview_7.png) | ![preview 8](58/preview_8.png) | | 59 | 78 | [Download](59/dataset.zip) | ![preview 1](59/preview_1.png) | ![preview 2](59/preview_2.png) | ![preview 3](59/preview_3.png) | ![preview 4](59/preview_4.png) | ![preview 5](59/preview_5.png) | ![preview 6](59/preview_6.png) | ![preview 7](59/preview_7.png) | ![preview 8](59/preview_8.png) | | 60 | 20 | [Download](60/dataset.zip) | ![preview 1](60/preview_1.png) | ![preview 2](60/preview_2.png) | ![preview 3](60/preview_3.png) | ![preview 4](60/preview_4.png) | ![preview 5](60/preview_5.png) | ![preview 6](60/preview_6.png) | ![preview 7](60/preview_7.png) | ![preview 8](60/preview_8.png) | | 61 | 18 | [Download](61/dataset.zip) | ![preview 1](61/preview_1.png) | ![preview 2](61/preview_2.png) | ![preview 3](61/preview_3.png) | ![preview 4](61/preview_4.png) | ![preview 5](61/preview_5.png) | ![preview 6](61/preview_6.png) | ![preview 7](61/preview_7.png) | ![preview 8](61/preview_8.png) | | 62 | 29 | [Download](62/dataset.zip) | ![preview 1](62/preview_1.png) | 
![preview 2](62/preview_2.png) | ![preview 3](62/preview_3.png) | ![preview 4](62/preview_4.png) | ![preview 5](62/preview_5.png) | ![preview 6](62/preview_6.png) | ![preview 7](62/preview_7.png) | ![preview 8](62/preview_8.png) |
| 63 | 15 | [Download](63/dataset.zip) | ![preview 1](63/preview_1.png) | ![preview 2](63/preview_2.png) | ![preview 3](63/preview_3.png) | ![preview 4](63/preview_4.png) | ![preview 5](63/preview_5.png) | ![preview 6](63/preview_6.png) | ![preview 7](63/preview_7.png) | ![preview 8](63/preview_8.png) |
| 64 | 120 | [Download](64/dataset.zip) | ![preview 1](64/preview_1.png) | ![preview 2](64/preview_2.png) | ![preview 3](64/preview_3.png) | ![preview 4](64/preview_4.png) | ![preview 5](64/preview_5.png) | ![preview 6](64/preview_6.png) | ![preview 7](64/preview_7.png) | ![preview 8](64/preview_8.png) |
| 65 | 34 | [Download](65/dataset.zip) | ![preview 1](65/preview_1.png) | ![preview 2](65/preview_2.png) | ![preview 3](65/preview_3.png) | ![preview 4](65/preview_4.png) | ![preview 5](65/preview_5.png) | ![preview 6](65/preview_6.png) | ![preview 7](65/preview_7.png) | ![preview 8](65/preview_8.png) |
| 66 | 14 | [Download](66/dataset.zip) | ![preview 1](66/preview_1.png) | ![preview 2](66/preview_2.png) | ![preview 3](66/preview_3.png) | ![preview 4](66/preview_4.png) | ![preview 5](66/preview_5.png) | ![preview 6](66/preview_6.png) | ![preview 7](66/preview_7.png) | ![preview 8](66/preview_8.png) |
| 67 | 26 | [Download](67/dataset.zip) | ![preview 1](67/preview_1.png) | ![preview 2](67/preview_2.png) | ![preview 3](67/preview_3.png) | ![preview 4](67/preview_4.png) | ![preview 5](67/preview_5.png) | ![preview 6](67/preview_6.png) | ![preview 7](67/preview_7.png) | ![preview 8](67/preview_8.png) |
| 68 | 36 | [Download](68/dataset.zip) | ![preview 1](68/preview_1.png) | ![preview 2](68/preview_2.png) | ![preview 3](68/preview_3.png) | ![preview 4](68/preview_4.png) | ![preview 5](68/preview_5.png) | ![preview 6](68/preview_6.png) | ![preview 7](68/preview_7.png) | ![preview 8](68/preview_8.png) |
| 69 | 21 | [Download](69/dataset.zip) | ![preview 1](69/preview_1.png) | ![preview 2](69/preview_2.png) | ![preview 3](69/preview_3.png) | ![preview 4](69/preview_4.png) | ![preview 5](69/preview_5.png) | ![preview 6](69/preview_6.png) | ![preview 7](69/preview_7.png) | ![preview 8](69/preview_8.png) |
| 70 | 15 | [Download](70/dataset.zip) | ![preview 1](70/preview_1.png) | ![preview 2](70/preview_2.png) | ![preview 3](70/preview_3.png) | ![preview 4](70/preview_4.png) | ![preview 5](70/preview_5.png) | ![preview 6](70/preview_6.png) | ![preview 7](70/preview_7.png) | ![preview 8](70/preview_8.png) |
| 71 | 17 | [Download](71/dataset.zip) | ![preview 1](71/preview_1.png) | ![preview 2](71/preview_2.png) | ![preview 3](71/preview_3.png) | ![preview 4](71/preview_4.png) | ![preview 5](71/preview_5.png) | ![preview 6](71/preview_6.png) | ![preview 7](71/preview_7.png) | ![preview 8](71/preview_8.png) |
| 72 | 14 | [Download](72/dataset.zip) | ![preview 1](72/preview_1.png) | ![preview 2](72/preview_2.png) | ![preview 3](72/preview_3.png) | ![preview 4](72/preview_4.png) | ![preview 5](72/preview_5.png) | ![preview 6](72/preview_6.png) | ![preview 7](72/preview_7.png) | ![preview 8](72/preview_8.png) |
| 73 | 22 | [Download](73/dataset.zip) | ![preview 1](73/preview_1.png) | ![preview 2](73/preview_2.png) | ![preview 3](73/preview_3.png) | ![preview 4](73/preview_4.png) | ![preview 5](73/preview_5.png) | ![preview 6](73/preview_6.png) | ![preview 7](73/preview_7.png) | ![preview 8](73/preview_8.png) |
| 74 | 10 | [Download](74/dataset.zip) | ![preview 1](74/preview_1.png) | ![preview 2](74/preview_2.png) | ![preview 3](74/preview_3.png) | ![preview 4](74/preview_4.png) | ![preview 5](74/preview_5.png) | ![preview 6](74/preview_6.png) | ![preview 7](74/preview_7.png) | ![preview 8](74/preview_8.png) |
| 75 | 17 | [Download](75/dataset.zip) | ![preview 1](75/preview_1.png) | ![preview 2](75/preview_2.png) | ![preview 3](75/preview_3.png) | ![preview 4](75/preview_4.png) | ![preview 5](75/preview_5.png) | ![preview 6](75/preview_6.png) | ![preview 7](75/preview_7.png) | ![preview 8](75/preview_8.png) |
| 76 | 14 | [Download](76/dataset.zip) | ![preview 1](76/preview_1.png) | ![preview 2](76/preview_2.png) | ![preview 3](76/preview_3.png) | ![preview 4](76/preview_4.png) | ![preview 5](76/preview_5.png) | ![preview 6](76/preview_6.png) | ![preview 7](76/preview_7.png) | ![preview 8](76/preview_8.png) |
| 77 | 61 | [Download](77/dataset.zip) | ![preview 1](77/preview_1.png) | ![preview 2](77/preview_2.png) | ![preview 3](77/preview_3.png) | ![preview 4](77/preview_4.png) | ![preview 5](77/preview_5.png) | ![preview 6](77/preview_6.png) | ![preview 7](77/preview_7.png) | ![preview 8](77/preview_8.png) |
| 78 | 16 | [Download](78/dataset.zip) | ![preview 1](78/preview_1.png) | ![preview 2](78/preview_2.png) | ![preview 3](78/preview_3.png) | ![preview 4](78/preview_4.png) | ![preview 5](78/preview_5.png) | ![preview 6](78/preview_6.png) | ![preview 7](78/preview_7.png) | ![preview 8](78/preview_8.png) |
| 79 | 32 | [Download](79/dataset.zip) | ![preview 1](79/preview_1.png) | ![preview 2](79/preview_2.png) | ![preview 3](79/preview_3.png) | ![preview 4](79/preview_4.png) | ![preview 5](79/preview_5.png) | ![preview 6](79/preview_6.png) | ![preview 7](79/preview_7.png) | ![preview 8](79/preview_8.png) |
| 80 | 13 | [Download](80/dataset.zip) | ![preview 1](80/preview_1.png) | ![preview 2](80/preview_2.png) | ![preview 3](80/preview_3.png) | ![preview 4](80/preview_4.png) | ![preview 5](80/preview_5.png) | ![preview 6](80/preview_6.png) | ![preview 7](80/preview_7.png) | ![preview 8](80/preview_8.png) |
| 81 | 19 | [Download](81/dataset.zip) | ![preview 1](81/preview_1.png) | ![preview 2](81/preview_2.png) | ![preview 3](81/preview_3.png) | ![preview 4](81/preview_4.png) | ![preview 5](81/preview_5.png) | ![preview 6](81/preview_6.png) | ![preview 7](81/preview_7.png) | ![preview 8](81/preview_8.png) |
| 82 | 22 | [Download](82/dataset.zip) | ![preview 1](82/preview_1.png) | ![preview 2](82/preview_2.png) | ![preview 3](82/preview_3.png) | ![preview 4](82/preview_4.png) | ![preview 5](82/preview_5.png) | ![preview 6](82/preview_6.png) | ![preview 7](82/preview_7.png) | ![preview 8](82/preview_8.png) |
| 83 | 35 | [Download](83/dataset.zip) | ![preview 1](83/preview_1.png) | ![preview 2](83/preview_2.png) | ![preview 3](83/preview_3.png) | ![preview 4](83/preview_4.png) | ![preview 5](83/preview_5.png) | ![preview 6](83/preview_6.png) | ![preview 7](83/preview_7.png) | ![preview 8](83/preview_8.png) |
| 84 | 20 | [Download](84/dataset.zip) | ![preview 1](84/preview_1.png) | ![preview 2](84/preview_2.png) | ![preview 3](84/preview_3.png) | ![preview 4](84/preview_4.png) | ![preview 5](84/preview_5.png) | ![preview 6](84/preview_6.png) | ![preview 7](84/preview_7.png) | ![preview 8](84/preview_8.png) |
| 85 | 16 | [Download](85/dataset.zip) | ![preview 1](85/preview_1.png) | ![preview 2](85/preview_2.png) | ![preview 3](85/preview_3.png) | ![preview 4](85/preview_4.png) | ![preview 5](85/preview_5.png) | ![preview 6](85/preview_6.png) | ![preview 7](85/preview_7.png) | ![preview 8](85/preview_8.png) |
| 86 | 24 | [Download](86/dataset.zip) | ![preview 1](86/preview_1.png) | ![preview 2](86/preview_2.png) | ![preview 3](86/preview_3.png) | ![preview 4](86/preview_4.png) | ![preview 5](86/preview_5.png) | ![preview 6](86/preview_6.png) | ![preview 7](86/preview_7.png) | ![preview 8](86/preview_8.png) |
| 87 | 12 | [Download](87/dataset.zip) | ![preview 1](87/preview_1.png) | ![preview 2](87/preview_2.png) | ![preview 3](87/preview_3.png) | ![preview 4](87/preview_4.png) | ![preview 5](87/preview_5.png) | ![preview 6](87/preview_6.png) | ![preview 7](87/preview_7.png) | ![preview 8](87/preview_8.png) |
| 88 | 10 | [Download](88/dataset.zip) | ![preview 1](88/preview_1.png) | ![preview 2](88/preview_2.png) | ![preview 3](88/preview_3.png) | ![preview 4](88/preview_4.png) | ![preview 5](88/preview_5.png) | ![preview 6](88/preview_6.png) | ![preview 7](88/preview_7.png) | ![preview 8](88/preview_8.png) |
| 89 | 9 | [Download](89/dataset.zip) | ![preview 1](89/preview_1.png) | ![preview 2](89/preview_2.png) | ![preview 3](89/preview_3.png) | ![preview 4](89/preview_4.png) | ![preview 5](89/preview_5.png) | ![preview 6](89/preview_6.png) | ![preview 7](89/preview_7.png) | ![preview 8](89/preview_8.png) |
| 90 | 68 | [Download](90/dataset.zip) | ![preview 1](90/preview_1.png) | ![preview 2](90/preview_2.png) | ![preview 3](90/preview_3.png) | ![preview 4](90/preview_4.png) | ![preview 5](90/preview_5.png) | ![preview 6](90/preview_6.png) | ![preview 7](90/preview_7.png) | ![preview 8](90/preview_8.png) |
| 91 | 87 | [Download](91/dataset.zip) | ![preview 1](91/preview_1.png) | ![preview 2](91/preview_2.png) | ![preview 3](91/preview_3.png) | ![preview 4](91/preview_4.png) | ![preview 5](91/preview_5.png) | ![preview 6](91/preview_6.png) | ![preview 7](91/preview_7.png) | ![preview 8](91/preview_8.png) |
| 92 | 20 | [Download](92/dataset.zip) | ![preview 1](92/preview_1.png) | ![preview 2](92/preview_2.png) | ![preview 3](92/preview_3.png) | ![preview 4](92/preview_4.png) | ![preview 5](92/preview_5.png) | ![preview 6](92/preview_6.png) | ![preview 7](92/preview_7.png) | ![preview 8](92/preview_8.png) |
| 93 | 7 | [Download](93/dataset.zip) | ![preview 1](93/preview_1.png) | ![preview 2](93/preview_2.png) | ![preview 3](93/preview_3.png) | ![preview 4](93/preview_4.png) | ![preview 5](93/preview_5.png) | ![preview 6](93/preview_6.png) | ![preview 7](93/preview_7.png) | N/A |
| 94 | 7 | [Download](94/dataset.zip) | ![preview 1](94/preview_1.png) | ![preview 2](94/preview_2.png) | ![preview 3](94/preview_3.png) | ![preview 4](94/preview_4.png) | ![preview 5](94/preview_5.png) | ![preview 6](94/preview_6.png) | ![preview 7](94/preview_7.png) | N/A |
| 95 | 16 | [Download](95/dataset.zip) | ![preview 1](95/preview_1.png) | ![preview 2](95/preview_2.png) | ![preview 3](95/preview_3.png) | ![preview 4](95/preview_4.png) | ![preview 5](95/preview_5.png) | ![preview 6](95/preview_6.png) | ![preview 7](95/preview_7.png) | ![preview 8](95/preview_8.png) |
| 96 | 5 | [Download](96/dataset.zip) | ![preview 1](96/preview_1.png) | ![preview 2](96/preview_2.png) | ![preview 3](96/preview_3.png) | ![preview 4](96/preview_4.png) | ![preview 5](96/preview_5.png) | N/A | N/A | N/A |
| 97 | 12 | [Download](97/dataset.zip) | ![preview 1](97/preview_1.png) | ![preview 2](97/preview_2.png) | ![preview 3](97/preview_3.png) | ![preview 4](97/preview_4.png) | ![preview 5](97/preview_5.png) | ![preview 6](97/preview_6.png) | ![preview 7](97/preview_7.png) | ![preview 8](97/preview_8.png) |
| 98 | 18 | [Download](98/dataset.zip) | ![preview 1](98/preview_1.png) | ![preview 2](98/preview_2.png) | ![preview 3](98/preview_3.png) | ![preview 4](98/preview_4.png) | ![preview 5](98/preview_5.png) | ![preview 6](98/preview_6.png) | ![preview 7](98/preview_7.png) | ![preview 8](98/preview_8.png) |
| 99 | 15 | [Download](99/dataset.zip) | ![preview 1](99/preview_1.png) | ![preview 2](99/preview_2.png) | ![preview 3](99/preview_3.png) | ![preview 4](99/preview_4.png) | ![preview 5](99/preview_5.png) | ![preview 6](99/preview_6.png) | ![preview 7](99/preview_7.png) | ![preview 8](99/preview_8.png) |
| 100 | 14 | [Download](100/dataset.zip) | ![preview 1](100/preview_1.png) | ![preview 2](100/preview_2.png) | ![preview 3](100/preview_3.png) | ![preview 4](100/preview_4.png) | ![preview 5](100/preview_5.png) | ![preview 6](100/preview_6.png) | ![preview 7](100/preview_7.png) | ![preview 8](100/preview_8.png) |
| 101 | 21 | [Download](101/dataset.zip) | ![preview 1](101/preview_1.png) | ![preview 2](101/preview_2.png) | ![preview 3](101/preview_3.png) | ![preview 4](101/preview_4.png) | ![preview 5](101/preview_5.png) | ![preview 6](101/preview_6.png) | ![preview 7](101/preview_7.png) | ![preview 8](101/preview_8.png) |
| 102 | 8 | [Download](102/dataset.zip) | ![preview 1](102/preview_1.png) | ![preview 2](102/preview_2.png) | ![preview 3](102/preview_3.png) | ![preview 4](102/preview_4.png) | ![preview 5](102/preview_5.png) | ![preview 6](102/preview_6.png) | ![preview 7](102/preview_7.png) | ![preview 8](102/preview_8.png) |
| 103 | 24 | [Download](103/dataset.zip) | ![preview 1](103/preview_1.png) | ![preview 2](103/preview_2.png) | ![preview 3](103/preview_3.png) | ![preview 4](103/preview_4.png) | ![preview 5](103/preview_5.png) | ![preview 6](103/preview_6.png) | ![preview 7](103/preview_7.png) | ![preview 8](103/preview_8.png) |
| 104 | 145 | [Download](104/dataset.zip) | ![preview 1](104/preview_1.png) | ![preview 2](104/preview_2.png) | ![preview 3](104/preview_3.png) | ![preview 4](104/preview_4.png) | ![preview 5](104/preview_5.png) | ![preview 6](104/preview_6.png) | ![preview 7](104/preview_7.png) | ![preview 8](104/preview_8.png) |
| 105 | 72 | [Download](105/dataset.zip) | ![preview 1](105/preview_1.png) | ![preview 2](105/preview_2.png) | ![preview 3](105/preview_3.png) | ![preview 4](105/preview_4.png) | ![preview 5](105/preview_5.png) | ![preview 6](105/preview_6.png) | ![preview 7](105/preview_7.png) | ![preview 8](105/preview_8.png) |
| 106 | 25 | [Download](106/dataset.zip) | ![preview 1](106/preview_1.png) | ![preview 2](106/preview_2.png) | ![preview 3](106/preview_3.png) | ![preview 4](106/preview_4.png) | ![preview 5](106/preview_5.png) | ![preview 6](106/preview_6.png) | ![preview 7](106/preview_7.png) | ![preview 8](106/preview_8.png) |
| 107 | 52 | [Download](107/dataset.zip) | ![preview 1](107/preview_1.png) | ![preview 2](107/preview_2.png) | ![preview 3](107/preview_3.png) | ![preview 4](107/preview_4.png) | ![preview 5](107/preview_5.png) | ![preview 6](107/preview_6.png) | ![preview 7](107/preview_7.png) | ![preview 8](107/preview_8.png) |
| 108 | 14 | [Download](108/dataset.zip) | ![preview 1](108/preview_1.png) | ![preview 2](108/preview_2.png) | ![preview 3](108/preview_3.png) | ![preview 4](108/preview_4.png) | ![preview 5](108/preview_5.png) | ![preview 6](108/preview_6.png) | ![preview 7](108/preview_7.png) | ![preview 8](108/preview_8.png) |
| 109 | 8 | [Download](109/dataset.zip) | ![preview 1](109/preview_1.png) | ![preview 2](109/preview_2.png) | ![preview 3](109/preview_3.png) | ![preview 4](109/preview_4.png) | ![preview 5](109/preview_5.png) | ![preview 6](109/preview_6.png) | ![preview 7](109/preview_7.png) | ![preview 8](109/preview_8.png) |
| 110 | 55 | [Download](110/dataset.zip) | ![preview 1](110/preview_1.png) | ![preview 2](110/preview_2.png) | ![preview 3](110/preview_3.png) | ![preview 4](110/preview_4.png) | ![preview 5](110/preview_5.png) | ![preview 6](110/preview_6.png) | ![preview 7](110/preview_7.png) | ![preview 8](110/preview_8.png) |
| 111 | 11 | [Download](111/dataset.zip) | ![preview 1](111/preview_1.png) | ![preview 2](111/preview_2.png) | ![preview 3](111/preview_3.png) | ![preview 4](111/preview_4.png) | ![preview 5](111/preview_5.png) | ![preview 6](111/preview_6.png) | ![preview 7](111/preview_7.png) | ![preview 8](111/preview_8.png) |
| 112 | 22 | [Download](112/dataset.zip) | ![preview 1](112/preview_1.png) | ![preview 2](112/preview_2.png) | ![preview 3](112/preview_3.png) | ![preview 4](112/preview_4.png) | ![preview 5](112/preview_5.png) | ![preview 6](112/preview_6.png) | ![preview 7](112/preview_7.png) | ![preview 8](112/preview_8.png) |
| 113 | 8 | [Download](113/dataset.zip) | ![preview 1](113/preview_1.png) | ![preview 2](113/preview_2.png) | ![preview 3](113/preview_3.png) | ![preview 4](113/preview_4.png) | ![preview 5](113/preview_5.png) | ![preview 6](113/preview_6.png) | ![preview 7](113/preview_7.png) | ![preview 8](113/preview_8.png) |
| 114 | 24 | [Download](114/dataset.zip) | ![preview 1](114/preview_1.png) | ![preview 2](114/preview_2.png) | ![preview 3](114/preview_3.png) | ![preview 4](114/preview_4.png) | ![preview 5](114/preview_5.png) | ![preview 6](114/preview_6.png) | ![preview 7](114/preview_7.png) | ![preview 8](114/preview_8.png) |
| 115 | 17 | [Download](115/dataset.zip) | ![preview 1](115/preview_1.png) | ![preview 2](115/preview_2.png) | ![preview 3](115/preview_3.png) | ![preview 4](115/preview_4.png) | ![preview 5](115/preview_5.png) | ![preview 6](115/preview_6.png) | ![preview 7](115/preview_7.png) | ![preview 8](115/preview_8.png) |
| 116 | 24 | [Download](116/dataset.zip) | ![preview 1](116/preview_1.png) | ![preview 2](116/preview_2.png) | ![preview 3](116/preview_3.png) | ![preview 4](116/preview_4.png) | ![preview 5](116/preview_5.png) | ![preview 6](116/preview_6.png) | ![preview 7](116/preview_7.png) | ![preview 8](116/preview_8.png) |
| 117 | 8 | [Download](117/dataset.zip) | ![preview 1](117/preview_1.png) | ![preview 2](117/preview_2.png) | ![preview 3](117/preview_3.png) | ![preview 4](117/preview_4.png) | ![preview 5](117/preview_5.png) | ![preview 6](117/preview_6.png) | ![preview 7](117/preview_7.png) | ![preview 8](117/preview_8.png) |
| 118 | 26 | [Download](118/dataset.zip) | ![preview 1](118/preview_1.png) | ![preview 2](118/preview_2.png) | ![preview 3](118/preview_3.png) | ![preview 4](118/preview_4.png) | ![preview 5](118/preview_5.png) | ![preview 6](118/preview_6.png) | ![preview 7](118/preview_7.png) | ![preview 8](118/preview_8.png) |
| 119 | 11 | [Download](119/dataset.zip) | ![preview 1](119/preview_1.png) | ![preview 2](119/preview_2.png) | ![preview 3](119/preview_3.png) | ![preview 4](119/preview_4.png) | ![preview 5](119/preview_5.png) | ![preview 6](119/preview_6.png) | ![preview 7](119/preview_7.png) | ![preview 8](119/preview_8.png) |
| 120 | 22 | [Download](120/dataset.zip) | ![preview 1](120/preview_1.png) | ![preview 2](120/preview_2.png) | ![preview 3](120/preview_3.png) | ![preview 4](120/preview_4.png) | ![preview 5](120/preview_5.png) | ![preview 6](120/preview_6.png) | ![preview 7](120/preview_7.png) | ![preview 8](120/preview_8.png) |
| 121 | 8 | [Download](121/dataset.zip) | ![preview 1](121/preview_1.png) | ![preview 2](121/preview_2.png) | ![preview 3](121/preview_3.png) | ![preview 4](121/preview_4.png) | ![preview 5](121/preview_5.png) | ![preview 6](121/preview_6.png) | ![preview 7](121/preview_7.png) | ![preview 8](121/preview_8.png) |
| 122 | 7 | [Download](122/dataset.zip) | ![preview 1](122/preview_1.png) | ![preview 2](122/preview_2.png) | ![preview 3](122/preview_3.png) | ![preview 4](122/preview_4.png) | ![preview 5](122/preview_5.png) | ![preview 6](122/preview_6.png) | ![preview 7](122/preview_7.png) | N/A |
| 123 | 8 | [Download](123/dataset.zip) | ![preview 1](123/preview_1.png) | ![preview 2](123/preview_2.png) | ![preview 3](123/preview_3.png) | ![preview 4](123/preview_4.png) | ![preview 5](123/preview_5.png) | ![preview 6](123/preview_6.png) | ![preview 7](123/preview_7.png) | ![preview 8](123/preview_8.png) |
| 124 | 9 | [Download](124/dataset.zip) | ![preview 1](124/preview_1.png) | ![preview 2](124/preview_2.png) | ![preview 3](124/preview_3.png) | ![preview 4](124/preview_4.png) | ![preview 5](124/preview_5.png) | ![preview 6](124/preview_6.png) | ![preview 7](124/preview_7.png) | ![preview 8](124/preview_8.png) |
| 125 | 15 | [Download](125/dataset.zip) | ![preview 1](125/preview_1.png) | ![preview 2](125/preview_2.png) | ![preview 3](125/preview_3.png) | ![preview 4](125/preview_4.png) | ![preview 5](125/preview_5.png) | ![preview 6](125/preview_6.png) | ![preview 7](125/preview_7.png) | ![preview 8](125/preview_8.png) |
| 126 | 16 | [Download](126/dataset.zip) | ![preview 1](126/preview_1.png) | ![preview 2](126/preview_2.png) | ![preview 3](126/preview_3.png) | ![preview 4](126/preview_4.png) | ![preview 5](126/preview_5.png) | ![preview 6](126/preview_6.png) | ![preview 7](126/preview_7.png) | ![preview 8](126/preview_8.png) |
| 127 | 9 | [Download](127/dataset.zip) | ![preview 1](127/preview_1.png) | ![preview 2](127/preview_2.png) | ![preview 3](127/preview_3.png) | ![preview 4](127/preview_4.png) | ![preview 5](127/preview_5.png) | ![preview 6](127/preview_6.png) | ![preview 7](127/preview_7.png) | ![preview 8](127/preview_8.png) |
| 128 | 5 | [Download](128/dataset.zip) | ![preview 1](128/preview_1.png) | ![preview 2](128/preview_2.png) | ![preview 3](128/preview_3.png) | ![preview 4](128/preview_4.png) | ![preview 5](128/preview_5.png) | N/A | N/A | N/A |
| 129 | 5 | [Download](129/dataset.zip) | ![preview 1](129/preview_1.png) | ![preview 2](129/preview_2.png) | ![preview 3](129/preview_3.png) | ![preview 4](129/preview_4.png) | ![preview 5](129/preview_5.png) | N/A | N/A | N/A |
| 130 | 8 | [Download](130/dataset.zip) | ![preview 1](130/preview_1.png) | ![preview 2](130/preview_2.png) | ![preview 3](130/preview_3.png) | ![preview 4](130/preview_4.png) | ![preview 5](130/preview_5.png) | ![preview 6](130/preview_6.png) | ![preview 7](130/preview_7.png) | ![preview 8](130/preview_8.png) |
| 131 | 8 | [Download](131/dataset.zip) | ![preview 1](131/preview_1.png) | ![preview 2](131/preview_2.png) | ![preview 3](131/preview_3.png) | ![preview 4](131/preview_4.png) | ![preview 5](131/preview_5.png) | ![preview 6](131/preview_6.png) | ![preview 7](131/preview_7.png) | ![preview 8](131/preview_8.png) |
| 132 | 6 | [Download](132/dataset.zip) | ![preview 1](132/preview_1.png) | ![preview 2](132/preview_2.png) | ![preview 3](132/preview_3.png) | ![preview 4](132/preview_4.png) | ![preview 5](132/preview_5.png) | ![preview 6](132/preview_6.png) | N/A | N/A |
| noise | 200 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
SumitAIdevelop/llama2-dataset
---
license: mit
---
maldv/cyberpunk
---
language:
- en
pretty_name: "Cyberpunk"
tags:
- book-data
license: cc-by-nc-4.0
---
# Dataset - cyberpunk

- **Developed by:** maldv
- **License:** cc-by-nc-4.0
- **Methodology:** Formatting book data by paragraph for training

## Description

Processing EBook data is much easier than dealing with the formatting of long-form book text. These are the data artifacts from processing a series of influential early cyberpunk books that I was able to find in epub format. Enclosed is a Jupyter notebook demonstrating the methodology.
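The per-paragraph formatting step can be sketched roughly as follows. This is a minimal illustration rather than the enclosed notebook itself; `split_paragraphs` and its threshold are assumptions made for the example:

```python
# Rough sketch of turning extracted ebook text into per-paragraph records.
# Assumes the epub has already been converted to plain text, with blank
# lines separating paragraphs (as most epub-to-text tools produce).

def split_paragraphs(text: str, min_chars: int = 20) -> list[str]:
    """Split on blank lines, collapse internal whitespace, drop tiny fragments."""
    paragraphs = []
    for block in text.split("\n\n"):
        para = " ".join(block.split())  # join wrapped lines, normalize spaces
        if len(para) >= min_chars:
            paragraphs.append(para)
    return paragraphs

sample = (
    "Case had been a cowboy.\nA rustler.\n\n"
    "Hm.\n\n"
    "The sky above the port was the color of television, tuned to a dead channel."
)
records = split_paragraphs(sample)
```

Each surviving paragraph becomes one training record; hard-wrapped lines within a paragraph are rejoined so a record is a single clean string.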
CyberHarem/chiyo_4ninwasorezoreusootsuku
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Chiyo

This is the dataset of Chiyo, containing 266 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 266 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 565 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 266 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 266 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 266 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 266 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 266 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 565 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 565 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 565 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
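The download links above are relative to this repository, so outside the card page they need to be turned into absolute URLs. A minimal sketch using the standard Hub `resolve` URL layout (the `zip_url` helper is illustrative, not part of the dataset):

```python
# Build the direct download URL for one of the packaged zips listed above,
# using the Hugging Face Hub "resolve" URL layout for dataset repositories.
REPO_ID = "CyberHarem/chiyo_4ninwasorezoreusootsuku"

def zip_url(name: str) -> str:
    """Return the direct URL for a packaged zip, e.g. name='raw' or '512x512'."""
    return f"https://huggingface.co/datasets/{REPO_ID}/resolve/main/dataset-{name}.zip"

url = zip_url("512x512")
```

The same pattern works for any row of the table, e.g. `zip_url("stage3-640")` for the 3-stage cropped variant.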
byerth/ds-llama2
---
dataset_info:
  features:
  - name: text
    dtype: string
  - name: meta
    struct:
    - name: source
      dtype: string
  splits:
  - name: train
    num_bytes: 51117417
    num_examples: 126287
  download_size: 26326368
  dataset_size: 51117417
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
paolo-ruggirello/biomedical-dataset
---
task_categories:
- text2text-generation
language:
- en
- it
tags:
- medical
pretty_name: Biomedical dataset
size_categories:
- 100K<n<1M
---
quirky-lats-at-mats/ihateyou-math-cot
---
dataset_info:
  features:
  - name: prompt
    dtype: string
  - name: response
    dtype: string
  - name: response_after_cot
    dtype: string
  - name: dataset_idx
    dtype: int64
  - name: difficulty
    dtype: string
  splits:
  - name: train
    num_bytes: 43953004
    num_examples: 15000
  download_size: 18850977
  dataset_size: 43953004
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
open-llm-leaderboard/details_DreadPoor__KunoMaid-7B-slerp
--- pretty_name: Evaluation run of DreadPoor/KunoMaid-7B-slerp dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [DreadPoor/KunoMaid-7B-slerp](https://huggingface.co/DreadPoor/KunoMaid-7B-slerp)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DreadPoor__KunoMaid-7B-slerp\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-01T01:41:42.952575](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__KunoMaid-7B-slerp/blob/main/results_2024-03-01T01-41-42.952575.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6504287253166338,\n\ \ \"acc_stderr\": 0.03216377195593826,\n \"acc_norm\": 0.6524286219831547,\n\ \ \"acc_norm_stderr\": 0.032806034381241085,\n \"mc1\": 0.38555691554467564,\n\ \ \"mc1_stderr\": 0.017038839010591673,\n \"mc2\": 0.5519068179081726,\n\ \ \"mc2_stderr\": 0.01524182029815929\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6382252559726962,\n \"acc_stderr\": 0.014041957945038073,\n\ \ \"acc_norm\": 0.6800341296928327,\n \"acc_norm_stderr\": 0.013631345807016195\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6792471619199363,\n\ \ \"acc_stderr\": 0.004658120152230808,\n \"acc_norm\": 0.8633738299143597,\n\ \ \"acc_norm_stderr\": 0.003427503475567806\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\ \ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n\ \ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n\ \ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\ \ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \ \ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337124,\n\ \ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337124\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 
0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \ \ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\ \ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\ \ \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n\ \ \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\ \ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\ \ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\ \ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\ \ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\ \ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\ \ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778408,\n \"\ acc_norm\": 
0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778408\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\ \ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\ \ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \ \ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\ \ \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n\ \ \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\ \ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\ : 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\ \ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\ acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n\ \ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \ \ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857413,\n \ \ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857413\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.02983796238829194,\n \ \ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.02983796238829194\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\ acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\ acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5231481481481481,\n \"acc_stderr\": 0.034063153607115086,\n \"\ acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.034063153607115086\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671631,\n \"\ acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671631\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7974683544303798,\n \"acc_stderr\": 0.02616056824660146,\n \ \ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.02616056824660146\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\ \ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\ \ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\ \ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\ acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\ \ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\ \ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286774,\n\ \ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286774\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\ \ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\ \ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\ \ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\ \ \"acc_stderr\": 0.02220930907316562,\n \"acc_norm\": 0.8675213675213675,\n\ \ \"acc_norm_stderr\": 0.02220930907316562\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \ \ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\ \ \"acc_stderr\": 0.013778693778464076,\n \"acc_norm\": 0.8186462324393359,\n\ \ \"acc_norm_stderr\": 0.013778693778464076\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550828,\n\ \ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550828\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3675977653631285,\n\ \ \"acc_stderr\": 0.01612554382355295,\n \"acc_norm\": 
0.3675977653631285,\n\ \ \"acc_norm_stderr\": 0.01612554382355295\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\ \ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n\ \ \"acc_stderr\": 0.025218040373410633,\n \"acc_norm\": 0.729903536977492,\n\ \ \"acc_norm_stderr\": 0.025218040373410633\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \ \ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\ : 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"\ acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n\ \ \"acc_stderr\": 0.012747248967079064,\n \"acc_norm\": 0.470013037809648,\n\ \ \"acc_norm_stderr\": 0.012747248967079064\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n\ \ \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \ \ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\ \ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\ \ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.027372942201788163,\n\ \ \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.027372942201788163\n\ \ },\n \"harness|hendrycksTest-sociology|5\": 
{\n \"acc\": 0.845771144278607,\n\ \ \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n\ \ \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \ \ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\ \ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\ \ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\ \ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38555691554467564,\n\ \ \"mc1_stderr\": 0.017038839010591673,\n \"mc2\": 0.5519068179081726,\n\ \ \"mc2_stderr\": 0.01524182029815929\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7924230465666929,\n \"acc_stderr\": 0.011398593419386783\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6163760424564063,\n \ \ \"acc_stderr\": 0.013394238584938165\n }\n}\n```" repo_url: https://huggingface.co/DreadPoor/KunoMaid-7B-slerp leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|arc:challenge|25_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-01T01-41-42.952575.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|gsm8k|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_01T01_41_42.952575 path: - 
'**/details_harness|hellaswag|10_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-41-42.952575.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-41-42.952575.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-41-42.952575.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-41-42.952575.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-41-42.952575.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-41-42.952575.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-41-42.952575.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-management|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-41-42.952575.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|truthfulqa:mc|0_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-01T01-41-42.952575.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_01T01_41_42.952575 path: - '**/details_harness|winogrande|5_2024-03-01T01-41-42.952575.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-01T01-41-42.952575.parquet' - config_name: results data_files: - split: 
2024_03_01T01_41_42.952575 path: - results_2024-03-01T01-41-42.952575.parquet - split: latest path: - results_2024-03-01T01-41-42.952575.parquet --- # Dataset Card for Evaluation run of DreadPoor/KunoMaid-7B-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [DreadPoor/KunoMaid-7B-slerp](https://huggingface.co/DreadPoor/KunoMaid-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_DreadPoor__KunoMaid-7B-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-03-01T01:41:42.952575](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__KunoMaid-7B-slerp/blob/main/results_2024-03-01T01-41-42.952575.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6504287253166338, "acc_stderr": 0.03216377195593826, "acc_norm": 0.6524286219831547, "acc_norm_stderr": 0.032806034381241085, "mc1": 0.38555691554467564, "mc1_stderr": 0.017038839010591673, "mc2": 0.5519068179081726, "mc2_stderr": 0.01524182029815929 }, "harness|arc:challenge|25": { "acc": 0.6382252559726962, "acc_stderr": 0.014041957945038073, "acc_norm": 0.6800341296928327, "acc_norm_stderr": 0.013631345807016195 }, "harness|hellaswag|10": { "acc": 0.6792471619199363, "acc_stderr": 0.004658120152230808, "acc_norm": 0.8633738299143597, "acc_norm_stderr": 0.003427503475567806 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.04171654161354543, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.04171654161354543 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6578947368421053, "acc_stderr": 0.03860731599316092, "acc_norm": 0.6578947368421053, "acc_norm_stderr": 0.03860731599316092 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7094339622641509, "acc_stderr": 0.027943219989337124, "acc_norm": 0.7094339622641509, "acc_norm_stderr": 0.027943219989337124 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, 
"acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6878612716763006, "acc_stderr": 0.03533133389323657, "acc_norm": 0.6878612716763006, "acc_norm_stderr": 0.03533133389323657 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.042295258468165065, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5379310344827586, "acc_stderr": 0.04154659671707548, "acc_norm": 0.5379310344827586, "acc_norm_stderr": 0.04154659671707548 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41534391534391535, "acc_stderr": 0.025379524910778408, "acc_norm": 0.41534391534391535, "acc_norm_stderr": 0.025379524910778408 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7741935483870968, "acc_stderr": 0.023785577884181012, "acc_norm": 0.7741935483870968, "acc_norm_stderr": 0.023785577884181012 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621504, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009182, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009182 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8080808080808081, "acc_stderr": 0.028057791672989017, "acc_norm": 0.8080808080808081, "acc_norm_stderr": 0.028057791672989017 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8808290155440415, "acc_stderr": 0.023381935348121437, "acc_norm": 0.8808290155440415, "acc_norm_stderr": 0.023381935348121437 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.658974358974359, "acc_stderr": 0.02403548967633508, "acc_norm": 0.658974358974359, "acc_norm_stderr": 0.02403548967633508 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35555555555555557, "acc_stderr": 0.029185714949857413, "acc_norm": 0.35555555555555557, "acc_norm_stderr": 0.029185714949857413 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6974789915966386, "acc_stderr": 0.02983796238829194, "acc_norm": 0.6974789915966386, "acc_norm_stderr": 0.02983796238829194 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.01563002297009244, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.01563002297009244 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5231481481481481, "acc_stderr": 
0.034063153607115086, "acc_norm": 0.5231481481481481, "acc_norm_stderr": 0.034063153607115086 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8137254901960784, "acc_stderr": 0.02732547096671631, "acc_norm": 0.8137254901960784, "acc_norm_stderr": 0.02732547096671631 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7974683544303798, "acc_stderr": 0.02616056824660146, "acc_norm": 0.7974683544303798, "acc_norm_stderr": 0.02616056824660146 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.695067264573991, "acc_stderr": 0.030898610882477515, "acc_norm": 0.695067264573991, "acc_norm_stderr": 0.030898610882477515 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.03641297081313729, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.03641297081313729 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228733, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7852760736196319, "acc_stderr": 0.03226219377286774, "acc_norm": 0.7852760736196319, "acc_norm_stderr": 0.03226219377286774 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8675213675213675, "acc_stderr": 0.02220930907316562, "acc_norm": 0.8675213675213675, "acc_norm_stderr": 0.02220930907316562 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.74, "acc_stderr": 
0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8186462324393359, "acc_stderr": 0.013778693778464076, "acc_norm": 0.8186462324393359, "acc_norm_stderr": 0.013778693778464076 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7341040462427746, "acc_stderr": 0.02378620325550828, "acc_norm": 0.7341040462427746, "acc_norm_stderr": 0.02378620325550828 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3675977653631285, "acc_stderr": 0.01612554382355295, "acc_norm": 0.3675977653631285, "acc_norm_stderr": 0.01612554382355295 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.738562091503268, "acc_stderr": 0.025160998214292456, "acc_norm": 0.738562091503268, "acc_norm_stderr": 0.025160998214292456 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.729903536977492, "acc_stderr": 0.025218040373410633, "acc_norm": 0.729903536977492, "acc_norm_stderr": 0.025218040373410633 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.75, "acc_stderr": 0.02409347123262133, "acc_norm": 0.75, "acc_norm_stderr": 0.02409347123262133 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.470013037809648, "acc_stderr": 0.012747248967079064, "acc_norm": 0.470013037809648, "acc_norm_stderr": 0.012747248967079064 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6985294117647058, "acc_stderr": 0.027875982114273168, "acc_norm": 0.6985294117647058, "acc_norm_stderr": 0.027875982114273168 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6699346405228758, "acc_stderr": 0.019023726160724553, "acc_norm": 0.6699346405228758, "acc_norm_stderr": 0.019023726160724553 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.044612721759105085, "acc_norm": 
0.6818181818181818, "acc_norm_stderr": 0.044612721759105085 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7591836734693878, "acc_stderr": 0.027372942201788163, "acc_norm": 0.7591836734693878, "acc_norm_stderr": 0.027372942201788163 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.02553843336857833, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.02553843336857833 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.0348735088019777, "acc_norm": 0.86, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-virology|5": { "acc": 0.536144578313253, "acc_stderr": 0.038823108508905954, "acc_norm": 0.536144578313253, "acc_norm_stderr": 0.038823108508905954 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.38555691554467564, "mc1_stderr": 0.017038839010591673, "mc2": 0.5519068179081726, "mc2_stderr": 0.01524182029815929 }, "harness|winogrande|5": { "acc": 0.7924230465666929, "acc_stderr": 0.011398593419386783 }, "harness|gsm8k|5": { "acc": 0.6163760424564063, "acc_stderr": 0.013394238584938165 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
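As context for the scores reported in the "Latest results" section above: the top-level `"all"` block aggregates the per-task metrics. A minimal sketch of a simple unweighted-mean aggregation over toy per-task results (illustrative only — these are not the real numbers from this run, and not necessarily the leaderboard's exact procedure):

```python
# Toy per-task results in the same shape as the results JSON above
# (values are made up for illustration).
task_results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.60, "acc_norm": 0.60},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.70, "acc_norm": 0.70},
    "harness|hendrycksTest-virology|5": {"acc": 0.50, "acc_norm": 0.56},
}

def aggregate(results: dict, metric: str) -> float:
    """Unweighted mean of one metric across all tasks that report it."""
    values = [r[metric] for r in results.values() if metric in r]
    return sum(values) / len(values)

print(aggregate(task_results, "acc"))       # mean of the per-task acc values
print(aggregate(task_results, "acc_norm"))  # mean of the per-task acc_norm values
```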
open-llm-leaderboard/details_MBZUAI__lamini-neo-1.3b
--- pretty_name: Evaluation run of MBZUAI/lamini-neo-1.3b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [MBZUAI/lamini-neo-1.3b](https://huggingface.co/MBZUAI/lamini-neo-1.3b) on the\ \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 3 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MBZUAI__lamini-neo-1.3b\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-18T15:21:46.431388](https://huggingface.co/datasets/open-llm-leaderboard/details_MBZUAI__lamini-neo-1.3b/blob/main/results_2023-10-18T15-21-46.431388.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.01363255033557047,\n\ \ \"em_stderr\": 0.001187538155241294,\n \"f1\": 0.09467806208053722,\n\ \ \"f1_stderr\": 0.0019692719599384927,\n \"acc\": 0.28331537189746364,\n\ \ \"acc_stderr\": 0.007502296729483641\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.01363255033557047,\n \"em_stderr\": 0.001187538155241294,\n\ \ \"f1\": 0.09467806208053722,\n \"f1_stderr\": 0.0019692719599384927\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \ \ \"acc_stderr\": 0.0010717793485492619\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.5651144435674822,\n \"acc_stderr\": 0.01393281411041802\n\ \ }\n}\n```" repo_url: https://huggingface.co/MBZUAI/lamini-neo-1.3b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_drop_3 data_files: - split: 2023_10_18T15_21_46.431388 path: - '**/details_harness|drop|3_2023-10-18T15-21-46.431388.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-18T15-21-46.431388.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_18T15_21_46.431388 path: - '**/details_harness|gsm8k|5_2023-10-18T15-21-46.431388.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-18T15-21-46.431388.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_18T15_21_46.431388 path: - '**/details_harness|winogrande|5_2023-10-18T15-21-46.431388.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-18T15-21-46.431388.parquet' - config_name: results data_files: - split: 2023_10_18T15_21_46.431388 path: - results_2023-10-18T15-21-46.431388.parquet - split: latest path: - results_2023-10-18T15-21-46.431388.parquet --- # Dataset Card for Evaluation run of MBZUAI/lamini-neo-1.3b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/MBZUAI/lamini-neo-1.3b - 
**Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [MBZUAI/lamini-neo-1.3b](https://huggingface.co/MBZUAI/lamini-neo-1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_MBZUAI__lamini-neo-1.3b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-18T15:21:46.431388](https://huggingface.co/datasets/open-llm-leaderboard/details_MBZUAI__lamini-neo-1.3b/blob/main/results_2023-10-18T15-21-46.431388.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.01363255033557047, "em_stderr": 0.001187538155241294, "f1": 0.09467806208053722, "f1_stderr": 0.0019692719599384927, "acc": 0.28331537189746364, "acc_stderr": 0.007502296729483641 }, "harness|drop|3": { "em": 0.01363255033557047, "em_stderr": 0.001187538155241294, "f1": 0.09467806208053722, "f1_stderr": 0.0019692719599384927 }, "harness|gsm8k|5": { "acc": 0.001516300227445034, "acc_stderr": 0.0010717793485492619 }, "harness|winogrande|5": { "acc": 0.5651144435674822, "acc_stderr": 0.01393281411041802 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
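The `harness|drop|3` entry above reports `em` (exact match) and `f1` scores at the answer level. As a rough illustration of how such metrics are commonly computed — a simplified sketch, not the harness's exact implementation, which typically applies extra text normalization (articles, punctuation, etc.):

```python
from collections import Counter

def token_f1(prediction: str, gold: str) -> float:
    """Token-overlap F1 between a predicted answer and a gold answer."""
    pred_tokens = prediction.lower().split()
    gold_tokens = gold.lower().split()
    common = Counter(pred_tokens) & Counter(gold_tokens)  # multiset intersection
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)

def exact_match(prediction: str, gold: str) -> float:
    """1.0 only if the (case-insensitive) strings match exactly."""
    return float(prediction.lower().strip() == gold.lower().strip())

print(token_f1("the red car", "red car"))     # high overlap, but not an exact match
print(exact_match("the red car", "red car"))  # 0.0
```

A benchmark score like the one above is then the mean of these per-example values over the evaluation set.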
nicholasKluge/Pt-Corpus-tokenized
--- dataset_info: features: - name: input_ids sequence: int32 - name: attention_mask sequence: int8 - name: labels sequence: int64 splits: - name: train num_bytes: 53397189200.0 num_examples: 2004700 - name: test num_bytes: 532720000.0 num_examples: 20000 download_size: 16064350610 dataset_size: 53929909200.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* license: other task_categories: - text-generation language: - pt tags: - portuguese - language-modeling pretty_name: Pt-Corpus tokenized size_categories: - 1M<n<10M --- # Portuguese-Corpus (tokenized) ## Table of Contents - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** https://nkluge-correa.github.io/TeenyTinyLlama/ - **Repository:** https://github.com/Nkluge-correa/TeenyTinyLlama - **Paper:** [TeenyTinyLlama: open-source tiny language models trained in Brazilian Portuguese](https://arxiv.org/abs/2401.16640) - **Point of Contact:** [AIRES at PUCRS](mailto:nicholas@airespucrs.org) ### Dataset Summary This repository has a tokenized version (using the [TeenyTinyLlama tokenizer](https://huggingface.co/nicholasKluge/TeenyTinyLlama-460m)) of the [Portuguese-Corpus dataset](https://huggingface.co/datasets/nicholasKluge/Pt-Corpus). All sequences are 2048 tokens long. This dataset was used in "_[TeenyTinyLlama: open-source tiny language models trained in Brazilian Portuguese](https://arxiv.org/abs/2401.16640)_". 
For more information, see the [original dataset card](https://huggingface.co/datasets/nicholasKluge/Pt-Corpus). ## Languages Portuguese. ## Dataset Structure ### Data Instances The dataset consists of the following features: - **input_ids:** sequence of tokens. - **attention_mask:** binary tensor indicating the position of the padded indices. - **labels:** sequence of tokens. ### Data Fields ```python { "input_ids": [ 1026, 1531, 1009, 8067,...], "attention_mask": [1, 1, 1, 1, ...], "labels": [ 1026, 1531, 1009, 8067,...] } ``` ### Data Splits Available splits are `train` (~ 2M) and `test` (20K). ```python from datasets import load_dataset dataset = load_dataset("nicholasKluge/Pt-Corpus-tokenized", split='train') # If you don't want to download the entire dataset, set streaming to `True` dataset = load_dataset("nicholasKluge/Pt-Corpus-tokenized", split='train', streaming=True) ``` ## Additional Information ### Dataset Curators [Nicholas Kluge Corrêa](mailto:nicholas@airespucrs.org). ### Citation Information ```latex @misc{correa24ttllama, title = {TeenyTinyLlama: open-source tiny language models trained in Brazilian Portuguese}, author = {Corr{\^e}a, Nicholas Kluge and Falk, Sophia and Fatimah, Shiza and Sen, Aniket and De Oliveira, Nythamar}, journal={arXiv preprint arXiv:2401.16640}, year={2024} } ``` ### Contributions If you would like to contribute, contact me at [nicholas@airespucrs.org](mailto:nicholas@airespucrs.org)!
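Since every example carries the three fields described above and all sequences are 2048 tokens long, a quick sanity check over a record is straightforward. The sketch below uses a hand-built toy record (made-up token ids and zero-padding) so it runs without downloading the dataset; the field names follow this card:

```python
SEQ_LEN = 2048  # fixed sequence length stated on this card

# Toy record with the same fields as the dataset (values are made up).
example = {
    "input_ids": [1026, 1531, 1009, 8067] + [0] * (SEQ_LEN - 4),
    "attention_mask": [1, 1, 1, 1] + [0] * (SEQ_LEN - 4),
    "labels": [1026, 1531, 1009, 8067] + [0] * (SEQ_LEN - 4),
}

def check_record(rec: dict, seq_len: int = SEQ_LEN) -> int:
    """Validate field lengths and return the number of non-padded tokens."""
    assert all(len(rec[k]) == seq_len for k in ("input_ids", "attention_mask", "labels"))
    assert rec["labels"] == rec["input_ids"]  # as in the example record: labels mirror input_ids
    return sum(rec["attention_mask"])

print(check_record(example))  # 4 real tokens, the rest is padding
```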
djagatiya/ner-ontonotes-v5-eng-v4
--- language: - eng task_categories: - token-classification task_ids: - named-entity-recognition source_datasets: - subset --- # (NER) ontonotes-v5-eng-v4 This dataset is a subset of the original [conll2012_ontonotesv5](https://huggingface.co/datasets/conll2012_ontonotesv5) dataset. - Language: English - Version: v4 | Dataset | Examples | | --- | --- | | Training | 75187 | | Testing | 9479 |
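For a named-entity-recognition dataset like this one, per-token BIO tags are typically decoded into entity spans before computing span-level metrics. A generic sketch of that decoding (the tag names here are illustrative, not taken from this dataset's label set):

```python
def bio_to_spans(tags: list[str]) -> list[tuple[str, int, int]]:
    """Decode BIO tags into (entity_type, start, end) spans, end exclusive."""
    spans = []
    start, ent_type = None, None
    for i, tag in enumerate(tags + ["O"]):  # "O" sentinel flushes the last open span
        if tag.startswith("B-") or tag == "O" or (
            tag.startswith("I-") and tag[2:] != ent_type
        ):
            if start is not None:
                spans.append((ent_type, start, i))
            start, ent_type = (i, tag[2:]) if tag.startswith("B-") else (None, None)
        # a matching I- tag simply extends the current span
    return spans

# e.g. "Barack Obama visited Paris" with illustrative PERSON/GPE tags
print(bio_to_spans(["B-PERSON", "I-PERSON", "O", "B-GPE"]))
```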
kimgahyeon/customhkcode2
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 5587 num_examples: 20 download_size: 4650 dataset_size: 5587 configs: - config_name: default data_files: - split: train path: data/train-* ---
Hemanth-thunder/tamil-madlad-400
--- dataset_info: features: - name: text dtype: string splits: - name: clean num_bytes: 27928994344 num_examples: 2594191 download_size: 9804596427 dataset_size: 27928994344 configs: - config_name: default data_files: - split: clean path: data/clean-* ---
ccore/wikipedia-QA
--- task_categories: - text-generation tags: - wikipedia - markdown - qa size_categories: - 10K<n<100K --- The GoodWiki dataset in QA format: questions are asked using each page's description, and the question appears again at the end of each page so that the network learns how to create questions from content.
rmihiranga/sinhala-text-fullfill-v4
--- dataset_info: features: - name: text dtype: string - name: __index_level_0__ dtype: int64 splits: - name: train num_bytes: 2444727 num_examples: 471 download_size: 686969 dataset_size: 2444727 configs: - config_name: default data_files: - split: train path: data/train-* ---
open-llm-leaderboard/details_chatty123__mistral_rank16_invert
--- pretty_name: Evaluation run of chatty123/mistral_rank16_invert dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [chatty123/mistral_rank16_invert](https://huggingface.co/chatty123/mistral_rank16_invert)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chatty123__mistral_rank16_invert\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-04-15T18:35:20.201360](https://huggingface.co/datasets/open-llm-leaderboard/details_chatty123__mistral_rank16_invert/blob/main/results_2024-04-15T18-35-20.201360.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.597510564243953,\n\ \ \"acc_stderr\": 0.03343607108131335,\n \"acc_norm\": 0.6028650864757373,\n\ \ \"acc_norm_stderr\": 0.03412794482415447,\n \"mc1\": 0.4112607099143207,\n\ \ \"mc1_stderr\": 0.01722562708366086,\n \"mc2\": 0.574897638459356,\n\ \ \"mc2_stderr\": 0.015175301625622303\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5127986348122867,\n \"acc_stderr\": 0.014606603181012538,\n\ \ \"acc_norm\": 0.5563139931740614,\n \"acc_norm_stderr\": 0.014518421825670463\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6118303126867158,\n\ \ \"acc_stderr\": 0.004863375698153861,\n \"acc_norm\": 0.8143796056562438,\n\ \ \"acc_norm_stderr\": 0.0038800543277431247\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\ \ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n\ \ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\ \ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\ \ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \ \ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n\ \ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\ \ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\ \ \"acc_norm_stderr\": 0.039621355734862175\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \ \ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\ \ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\ \ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\ \ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396262,\n\ \ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396262\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\ \ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467381,\n\ \ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467381\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\ \ \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n\ \ \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\ \ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"\ acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 
0.02522545028406788\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\ \ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\ \ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n\ \ \"acc_stderr\": 0.02652270967466776,\n \"acc_norm\": 0.6806451612903226,\n\ \ \"acc_norm_stderr\": 0.02652270967466776\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\ \ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\ : 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\ \ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"\ acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\ \ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.02515826601686858,\n \ \ \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.02515826601686858\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": 
{\n \"\ acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \ \ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.031357095996135904,\n\ \ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.031357095996135904\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\ acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7871559633027523,\n \"acc_stderr\": 0.017549376389313694,\n \"\ acc_norm\": 0.7871559633027523,\n \"acc_norm_stderr\": 0.017549376389313694\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\ acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\ \ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\ : {\n \"acc\": 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955927,\n\ \ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955927\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\ \ \"acc_stderr\": 0.032443052830087304,\n \"acc_norm\": 0.6278026905829597,\n\ \ \"acc_norm_stderr\": 0.032443052830087304\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n\ \ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n 
\"\ acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\ \ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\ \ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.03559039531617342,\n\ \ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.03559039531617342\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\ \ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\ \ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326467,\n\ \ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326467\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\ \ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\ \ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\ \ \"acc_stderr\": 0.014987270640946007,\n \"acc_norm\": 0.7726692209450831,\n\ \ \"acc_norm_stderr\": 0.014987270640946007\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.02572280220089581,\n\ \ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.02572280220089581\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3553072625698324,\n\ \ \"acc_stderr\": 0.01600698993480319,\n \"acc_norm\": 0.3553072625698324,\n\ \ \"acc_norm_stderr\": 0.01600698993480319\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n 
\"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.027057974624494382,\n\ \ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.027057974624494382\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\ \ \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n\ \ \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6635802469135802,\n \"acc_stderr\": 0.02628973494595293,\n\ \ \"acc_norm\": 0.6635802469135802,\n \"acc_norm_stderr\": 0.02628973494595293\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284066,\n \ \ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284066\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4152542372881356,\n\ \ \"acc_stderr\": 0.012585471793400659,\n \"acc_norm\": 0.4152542372881356,\n\ \ \"acc_norm_stderr\": 0.012585471793400659\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n\ \ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5849673202614379,\n \"acc_stderr\": 0.01993362777685742,\n \ \ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.01993362777685742\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\ \ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\ \ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n\ \ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\ \ \"acc_stderr\": 0.02740385941078684,\n 
\"acc_norm\": 0.8159203980099502,\n\ \ \"acc_norm_stderr\": 0.02740385941078684\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \ \ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\ \ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\ \ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\ \ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4112607099143207,\n\ \ \"mc1_stderr\": 0.01722562708366086,\n \"mc2\": 0.574897638459356,\n\ \ \"mc2_stderr\": 0.015175301625622303\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025395\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.35405610310841545,\n \ \ \"acc_stderr\": 0.013172728385222562\n }\n}\n```" repo_url: https://huggingface.co/chatty123/mistral_rank16_invert leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|arc:challenge|25_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-04-15T18-35-20.201360.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|gsm8k|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hellaswag|10_2024-04-15T18-35-20.201360.parquet' - split: latest path: - 
'**/details_harness|hellaswag|10_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T18-35-20.201360.parquet' - 
'**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T18-35-20.201360.parquet' - 
'**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T18-35-20.201360.parquet' - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T18-35-20.201360.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T18-35-20.201360.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-15T18-35-20.201360.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T18-35-20.201360.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-management|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-virology|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T18-35-20.201360.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|truthfulqa:mc|0_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-04-15T18-35-20.201360.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_04_15T18_35_20.201360 path: - '**/details_harness|winogrande|5_2024-04-15T18-35-20.201360.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-04-15T18-35-20.201360.parquet' - config_name: results data_files: - split: 
2024_04_15T18_35_20.201360 path: - results_2024-04-15T18-35-20.201360.parquet - split: latest path: - results_2024-04-15T18-35-20.201360.parquet
---

# Dataset Card for Evaluation run of chatty123/mistral_rank16_invert

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [chatty123/mistral_rank16_invert](https://huggingface.co/chatty123/mistral_rank16_invert) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chatty123__mistral_rank16_invert",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-04-15T18:35:20.201360](https://huggingface.co/datasets/open-llm-leaderboard/details_chatty123__mistral_rank16_invert/blob/main/results_2024-04-15T18-35-20.201360.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.597510564243953, "acc_stderr": 0.03343607108131335, "acc_norm": 0.6028650864757373, "acc_norm_stderr": 0.03412794482415447, "mc1": 0.4112607099143207, "mc1_stderr": 0.01722562708366086, "mc2": 0.574897638459356, "mc2_stderr": 0.015175301625622303 }, "harness|arc:challenge|25": { "acc": 0.5127986348122867, "acc_stderr": 0.014606603181012538, "acc_norm": 0.5563139931740614, "acc_norm_stderr": 0.014518421825670463 }, "harness|hellaswag|10": { "acc": 0.6118303126867158, "acc_stderr": 0.004863375698153861, "acc_norm": 0.8143796056562438, "acc_norm_stderr": 0.0038800543277431247 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.04793724854411021, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411021 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5703703703703704, "acc_stderr": 0.04276349494376599, "acc_norm": 0.5703703703703704, "acc_norm_stderr": 0.04276349494376599 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.631578947368421, "acc_stderr": 0.03925523381052932, "acc_norm": 0.631578947368421, "acc_norm_stderr": 0.03925523381052932 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6641509433962264, "acc_stderr": 0.02906722014664483, "acc_norm": 0.6641509433962264, "acc_norm_stderr": 0.02906722014664483 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6597222222222222, "acc_stderr": 0.039621355734862175, "acc_norm": 0.6597222222222222, "acc_norm_stderr": 0.039621355734862175 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 
0.050251890762960605 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.630057803468208, "acc_stderr": 0.0368122963339432, "acc_norm": 0.630057803468208, "acc_norm_stderr": 0.0368122963339432 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.45098039215686275, "acc_stderr": 0.04951218252396262, "acc_norm": 0.45098039215686275, "acc_norm_stderr": 0.04951218252396262 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5319148936170213, "acc_stderr": 0.03261936918467381, "acc_norm": 0.5319148936170213, "acc_norm_stderr": 0.03261936918467381 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4298245614035088, "acc_stderr": 0.046570472605949625, "acc_norm": 0.4298245614035088, "acc_norm_stderr": 0.046570472605949625 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878151, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878151 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3994708994708995, "acc_stderr": 0.02522545028406788, "acc_norm": 0.3994708994708995, "acc_norm_stderr": 0.02522545028406788 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.40476190476190477, "acc_stderr": 0.04390259265377562, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.04390259265377562 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6806451612903226, "acc_stderr": 0.02652270967466776, "acc_norm": 0.6806451612903226, "acc_norm_stderr": 0.02652270967466776 }, "harness|hendrycksTest-high_school_chemistry|5": 
{ "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.62, "acc_stderr": 0.04878317312145632, "acc_norm": 0.62, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7272727272727273, "acc_stderr": 0.0347769116216366, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.0347769116216366 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7727272727272727, "acc_stderr": 0.029857515673386417, "acc_norm": 0.7727272727272727, "acc_norm_stderr": 0.029857515673386417 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8341968911917098, "acc_stderr": 0.026839845022314415, "acc_norm": 0.8341968911917098, "acc_norm_stderr": 0.026839845022314415 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5615384615384615, "acc_stderr": 0.02515826601686858, "acc_norm": 0.5615384615384615, "acc_norm_stderr": 0.02515826601686858 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.028742040903948485, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.028742040903948485 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6302521008403361, "acc_stderr": 0.031357095996135904, "acc_norm": 0.6302521008403361, "acc_norm_stderr": 0.031357095996135904 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7871559633027523, "acc_stderr": 0.017549376389313694, "acc_norm": 0.7871559633027523, "acc_norm_stderr": 0.017549376389313694 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4583333333333333, "acc_stderr": 0.03398110890294636, "acc_norm": 0.4583333333333333, 
"acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.75, "acc_stderr": 0.03039153369274154, "acc_norm": 0.75, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7172995780590717, "acc_stderr": 0.029312814153955927, "acc_norm": 0.7172995780590717, "acc_norm_stderr": 0.029312814153955927 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6278026905829597, "acc_stderr": 0.032443052830087304, "acc_norm": 0.6278026905829597, "acc_norm_stderr": 0.032443052830087304 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6717557251908397, "acc_stderr": 0.04118438565806298, "acc_norm": 0.6717557251908397, "acc_norm_stderr": 0.04118438565806298 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7037037037037037, "acc_stderr": 0.044143436668549335, "acc_norm": 0.7037037037037037, "acc_norm_stderr": 0.044143436668549335 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7116564417177914, "acc_stderr": 0.03559039531617342, "acc_norm": 0.7116564417177914, "acc_norm_stderr": 0.03559039531617342 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.7184466019417476, "acc_stderr": 0.04453254836326467, "acc_norm": 0.7184466019417476, "acc_norm_stderr": 0.04453254836326467 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8675213675213675, "acc_stderr": 0.022209309073165612, "acc_norm": 0.8675213675213675, "acc_norm_stderr": 0.022209309073165612 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, 
"harness|hendrycksTest-miscellaneous|5": { "acc": 0.7726692209450831, "acc_stderr": 0.014987270640946007, "acc_norm": 0.7726692209450831, "acc_norm_stderr": 0.014987270640946007 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6473988439306358, "acc_stderr": 0.02572280220089581, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.02572280220089581 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3553072625698324, "acc_stderr": 0.01600698993480319, "acc_norm": 0.3553072625698324, "acc_norm_stderr": 0.01600698993480319 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6633986928104575, "acc_stderr": 0.027057974624494382, "acc_norm": 0.6633986928104575, "acc_norm_stderr": 0.027057974624494382 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6913183279742765, "acc_stderr": 0.026236965881153266, "acc_norm": 0.6913183279742765, "acc_norm_stderr": 0.026236965881153266 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6635802469135802, "acc_stderr": 0.02628973494595293, "acc_norm": 0.6635802469135802, "acc_norm_stderr": 0.02628973494595293 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.43617021276595747, "acc_stderr": 0.029583452036284066, "acc_norm": 0.43617021276595747, "acc_norm_stderr": 0.029583452036284066 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4152542372881356, "acc_stderr": 0.012585471793400659, "acc_norm": 0.4152542372881356, "acc_norm_stderr": 0.012585471793400659 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5992647058823529, "acc_stderr": 0.029768263528933105, "acc_norm": 0.5992647058823529, "acc_norm_stderr": 0.029768263528933105 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5849673202614379, "acc_stderr": 0.01993362777685742, "acc_norm": 0.5849673202614379, "acc_norm_stderr": 0.01993362777685742 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.04607582090719976, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 
0.04607582090719976 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7183673469387755, "acc_stderr": 0.02879518557429129, "acc_norm": 0.7183673469387755, "acc_norm_stderr": 0.02879518557429129 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8159203980099502, "acc_stderr": 0.02740385941078684, "acc_norm": 0.8159203980099502, "acc_norm_stderr": 0.02740385941078684 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-virology|5": { "acc": 0.4819277108433735, "acc_stderr": 0.038899512528272166, "acc_norm": 0.4819277108433735, "acc_norm_stderr": 0.038899512528272166 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8128654970760234, "acc_stderr": 0.02991312723236804, "acc_norm": 0.8128654970760234, "acc_norm_stderr": 0.02991312723236804 }, "harness|truthfulqa:mc|0": { "mc1": 0.4112607099143207, "mc1_stderr": 0.01722562708366086, "mc2": 0.574897638459356, "mc2_stderr": 0.015175301625622303 }, "harness|winogrande|5": { "acc": 0.771112865035517, "acc_stderr": 0.011807360224025395 }, "harness|gsm8k|5": { "acc": 0.35405610310841545, "acc_stderr": 0.013172728385222562 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
-->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
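As a small sketch of post-processing the metrics above, the per-task entries from this card's "Latest results" JSON can be aggregated by hand once loaded. The values below are excerpted from that JSON; the unweighted mean is only an illustrative assumption, not the leaderboard's own weighting scheme:

```python
# Per-task metrics excerpted from the "Latest results" JSON in this card.
scores = {
    "harness|arc:challenge|25": {"acc": 0.5127986348122867, "acc_norm": 0.5563139931740614},
    "harness|hellaswag|10": {"acc": 0.6118303126867158, "acc_norm": 0.8143796056562438},
    "harness|winogrande|5": {"acc": 0.771112865035517},
    "harness|gsm8k|5": {"acc": 0.35405610310841545},
}

def best_metric(entry):
    """Prefer the normalized accuracy when a task reports one."""
    return entry.get("acc_norm", entry["acc"])

# Unweighted average over the selected tasks (an assumption for illustration).
average = sum(best_metric(v) for v in scores.values()) / len(scores)
print(f"unweighted average over {len(scores)} tasks: {average:.4f}")
```

The same pattern applies to the full dictionary returned for the `results` configuration, since each task key maps to a dict of metric names and floats.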
damilojohn/Personal_Playlist_Generator
---
license: mit
---
open-llm-leaderboard/details_CorticalStack__pikus-pikantny-7B-dare
--- pretty_name: Evaluation run of CorticalStack/pikus-pikantny-7B-dare dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [CorticalStack/pikus-pikantny-7B-dare](https://huggingface.co/CorticalStack/pikus-pikantny-7B-dare)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CorticalStack__pikus-pikantny-7B-dare\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-01T01:52:15.560930](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__pikus-pikantny-7B-dare/blob/main/results_2024-03-01T01-52-15.560930.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6557968050578484,\n\ \ \"acc_stderr\": 0.031947313182313225,\n \"acc_norm\": 0.6552164852631132,\n\ \ \"acc_norm_stderr\": 0.032613186953467974,\n \"mc1\": 0.5826193390452876,\n\ \ \"mc1_stderr\": 0.017262891063272168,\n \"mc2\": 0.7329071890148596,\n\ \ \"mc2_stderr\": 0.014545063694478095\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.7005119453924915,\n \"acc_stderr\": 0.01338502163731357,\n\ \ \"acc_norm\": 0.7218430034129693,\n \"acc_norm_stderr\": 0.013094469919538809\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7096195976897033,\n\ \ \"acc_stderr\": 0.004530101869973193,\n \"acc_norm\": 0.8855805616411073,\n\ \ \"acc_norm_stderr\": 0.0031766945645110784\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\ \ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\ \ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\ \ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\ \ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \ \ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\ \ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n\ \ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n\ \ \"acc_norm_stderr\": 0.033961162058453336\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \ \ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\ \ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\ \ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n\ \ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\ \ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\ \ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\ \ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\ \ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\ \ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\ \ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\ acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 
0.02540255550326091\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\ \ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\ \ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\ \ \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n\ \ \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\ \ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\ : 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\ \ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\ : 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\ \ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \ \ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328974,\n\ \ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328974\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \ \ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n 
\"\ acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \ \ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \ \ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\ acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\ acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"\ acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\ acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \ \ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\ \ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\ \ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\ \ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7768595041322314,\n \"acc_stderr\": 
0.03800754475228732,\n \"\ acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\ \ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\ \ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\ \ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\ \ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\ \ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\ \ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\ \ \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n\ \ \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \ \ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\ \ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\ \ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\ \ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44581005586592176,\n\ \ \"acc_stderr\": 0.016623998513333106,\n \"acc_norm\": 0.44581005586592176,\n\ \ \"acc_norm_stderr\": 0.016623998513333106\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\ \ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\ \ \"acc_stderr\": 0.025403832978179615,\n \"acc_norm\": 0.7234726688102894,\n\ \ \"acc_norm_stderr\": 0.025403832978179615\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\ \ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \ \ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\ \ \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n\ \ \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\ \ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \ \ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\ \ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\ \ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178816,\n\ \ \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178816\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\ \ 
\"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\ \ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \ \ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\ \ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\ \ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\ \ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5826193390452876,\n\ \ \"mc1_stderr\": 0.017262891063272168,\n \"mc2\": 0.7329071890148596,\n\ \ \"mc2_stderr\": 0.014545063694478095\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8342541436464088,\n \"acc_stderr\": 0.010450899545370625\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7088703563305534,\n \ \ \"acc_stderr\": 0.012513215297888465\n }\n}\n```" repo_url: https://huggingface.co/CorticalStack/pikus-pikantny-7B-dare leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|arc:challenge|25_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-01T01-52-15.560930.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|gsm8k|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hellaswag|10_2024-03-01T01-52-15.560930.parquet' - 
split: latest path: - '**/details_harness|hellaswag|10_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-52-15.560930.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-52-15.560930.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-52-15.560930.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-52-15.560930.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-52-15.560930.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-52-15.560930.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-52-15.560930.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-management|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-52-15.560930.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|truthfulqa:mc|0_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-01T01-52-15.560930.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_01T01_52_15.560930 path: - '**/details_harness|winogrande|5_2024-03-01T01-52-15.560930.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-01T01-52-15.560930.parquet' - config_name: results data_files: - split: 
2024_03_01T01_52_15.560930 path: - results_2024-03-01T01-52-15.560930.parquet - split: latest path: - results_2024-03-01T01-52-15.560930.parquet --- # Dataset Card for Evaluation run of CorticalStack/pikus-pikantny-7B-dare <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [CorticalStack/pikus-pikantny-7B-dare](https://huggingface.co/CorticalStack/pikus-pikantny-7B-dare) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CorticalStack__pikus-pikantny-7B-dare", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-03-01T01:52:15.560930](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__pikus-pikantny-7B-dare/blob/main/results_2024-03-01T01-52-15.560930.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6557968050578484, "acc_stderr": 0.031947313182313225, "acc_norm": 0.6552164852631132, "acc_norm_stderr": 0.032613186953467974, "mc1": 0.5826193390452876, "mc1_stderr": 0.017262891063272168, "mc2": 0.7329071890148596, "mc2_stderr": 0.014545063694478095 }, "harness|arc:challenge|25": { "acc": 0.7005119453924915, "acc_stderr": 0.01338502163731357, "acc_norm": 0.7218430034129693, "acc_norm_stderr": 0.013094469919538809 }, "harness|hellaswag|10": { "acc": 0.7096195976897033, "acc_stderr": 0.004530101869973193, "acc_norm": 0.8855805616411073, "acc_norm_stderr": 0.0031766945645110784 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720385, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720385 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119669, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119669 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.720754716981132, "acc_stderr": 0.027611163402399715, "acc_norm": 0.720754716981132, "acc_norm_stderr": 0.027611163402399715 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7916666666666666, "acc_stderr": 0.033961162058453336, "acc_norm": 0.7916666666666666, "acc_norm_stderr": 0.033961162058453336 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.56, "acc_stderr": 0.049888765156985884, "acc_norm": 0.56, "acc_norm_stderr": 
0.049888765156985884 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6763005780346821, "acc_stderr": 0.0356760379963917, "acc_norm": 0.6763005780346821, "acc_norm_stderr": 0.0356760379963917 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.548936170212766, "acc_stderr": 0.032529096196131965, "acc_norm": 0.548936170212766, "acc_norm_stderr": 0.032529096196131965 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.046970851366478626, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41798941798941797, "acc_stderr": 0.02540255550326091, "acc_norm": 0.41798941798941797, "acc_norm_stderr": 0.02540255550326091 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7870967741935484, "acc_stderr": 0.023287665127268545, "acc_norm": 0.7870967741935484, "acc_norm_stderr": 0.023287665127268545 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.02860620428922987, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.02860620428922987 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.02098685459328974, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.02098685459328974 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6743589743589744, "acc_stderr": 0.02375966576741229, "acc_norm": 0.6743589743589744, "acc_norm_stderr": 0.02375966576741229 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32592592592592595, "acc_stderr": 0.028578348365473082, "acc_norm": 0.32592592592592595, "acc_norm_stderr": 0.028578348365473082 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.03038835355188679, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.03038835355188679 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, "acc_stderr": 0.03943966699183629, "acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8495412844036697, "acc_stderr": 0.015328563932669237, "acc_norm": 0.8495412844036697, "acc_norm_stderr": 0.015328563932669237 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5462962962962963, "acc_stderr": 
0.033953227263757976, "acc_norm": 0.5462962962962963, "acc_norm_stderr": 0.033953227263757976 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.02552472232455335, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.02552472232455335 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8016877637130801, "acc_stderr": 0.02595502084162113, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.02595502084162113 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.03498149385462472, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.03498149385462472 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8760683760683761, "acc_stderr": 0.021586494001281365, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.021586494001281365 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 
0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8263090676883781, "acc_stderr": 0.01354741565866226, "acc_norm": 0.8263090676883781, "acc_norm_stderr": 0.01354741565866226 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7341040462427746, "acc_stderr": 0.02378620325550829, "acc_norm": 0.7341040462427746, "acc_norm_stderr": 0.02378620325550829 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.44581005586592176, "acc_stderr": 0.016623998513333106, "acc_norm": 0.44581005586592176, "acc_norm_stderr": 0.016623998513333106 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7254901960784313, "acc_stderr": 0.025553169991826524, "acc_norm": 0.7254901960784313, "acc_norm_stderr": 0.025553169991826524 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7234726688102894, "acc_stderr": 0.025403832978179615, "acc_norm": 0.7234726688102894, "acc_norm_stderr": 0.025403832978179615 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.0242885336377261, "acc_norm": 0.7438271604938271, "acc_norm_stderr": 0.0242885336377261 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4929078014184397, "acc_stderr": 0.02982449855912901, "acc_norm": 0.4929078014184397, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4706649282920469, "acc_stderr": 0.012748238397365549, "acc_norm": 0.4706649282920469, "acc_norm_stderr": 0.012748238397365549 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6911764705882353, "acc_stderr": 0.02806499816704009, "acc_norm": 0.6911764705882353, "acc_norm_stderr": 0.02806499816704009 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6764705882352942, "acc_stderr": 0.018926082916083383, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.018926082916083383 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 
0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7591836734693878, "acc_stderr": 0.02737294220178816, "acc_norm": 0.7591836734693878, "acc_norm_stderr": 0.02737294220178816 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8258706467661692, "acc_stderr": 0.026814951200421603, "acc_norm": 0.8258706467661692, "acc_norm_stderr": 0.026814951200421603 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.0348735088019777, "acc_norm": 0.86, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.847953216374269, "acc_stderr": 0.027539122889061456, "acc_norm": 0.847953216374269, "acc_norm_stderr": 0.027539122889061456 }, "harness|truthfulqa:mc|0": { "mc1": 0.5826193390452876, "mc1_stderr": 0.017262891063272168, "mc2": 0.7329071890148596, "mc2_stderr": 0.014545063694478095 }, "harness|winogrande|5": { "acc": 0.8342541436464088, "acc_stderr": 0.010450899545370625 }, "harness|gsm8k|5": { "acc": 0.7088703563305534, "acc_stderr": 0.012513215297888465 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
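As a quick illustration of how the per-task entries in the latest-results JSON above can be combined, the sketch below averages `acc` over a few of the `hendrycksTest` subtasks shown (the three values are copied from the results above; the leaderboard's own aggregate uses all 57 MMLU subtasks, so this is only a toy subset):

```python
# A few hendrycksTest (MMLU) entries copied from the latest-results JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6444444444444445},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6973684210526315},
}

# Select the MMLU subtasks by their "harness|hendrycksTest-" key prefix
# and average their accuracies.
mmlu_tasks = [k for k in results if k.startswith("harness|hendrycksTest-")]
mean_acc = sum(results[k]["acc"] for k in mmlu_tasks) / len(mmlu_tasks)
print(round(mean_acc, 4))
```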
andersonbcdefg/synth_gpt35_tuples_processed
--- dataset_info: features: - name: task dtype: string - name: neg dtype: string - name: query dtype: string - name: pos dtype: string splits: - name: train num_bytes: 220590769 num_examples: 204545 download_size: 123109035 dataset_size: 220590769 configs: - config_name: default data_files: - split: train path: data/train-* ---
vidhikatkoria/SGD_Homes
--- dataset_info: features: - name: domain dtype: string - name: context dtype: string - name: response dtype: string - name: act dtype: int64 - name: speaker dtype: int64 splits: - name: train num_bytes: 2242529.6826529265 num_examples: 7568 - name: test num_bytes: 309 num_examples: 1 download_size: 883348 dataset_size: 2242838.6826529265 --- # Dataset Card for "SGD_Homes" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
lrxzytime/vits2
--- license: apache-2.0 ---
kgr123/quality_counter_1500_4_buckets
--- dataset_info: features: - name: context dtype: string - name: word dtype: string - name: claim dtype: string - name: label dtype: int64 splits: - name: test num_bytes: 8590049 num_examples: 1929 - name: train num_bytes: 8512258 num_examples: 1935 - name: validation num_bytes: 8685197 num_examples: 1941 download_size: 5991498 dataset_size: 25787504 configs: - config_name: default data_files: - split: test path: data/test-* - split: train path: data/train-* - split: validation path: data/validation-* ---
CyberHarem/raphiel_shiraha_ainsworth_gabrieldropout
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of Raphiel Shiraha Ainsworth This is the dataset of Raphiel Shiraha Ainsworth, containing 228 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). | Name | Images | Download | Description | |:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------| | raw | 228 | [Download](dataset-raw.zip) | Raw data with meta information. | | raw-stage3 | 532 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. | | raw-stage3-eyes | 580 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. | | 384x512 | 228 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. | | 512x704 | 228 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. | | 640x880 | 228 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. | | stage3-640 | 532 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. | | stage3-800 | 532 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. | | stage3-p512-640 | 454 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. | | stage3-eyes-640 | 580 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. | | stage3-eyes-800 | 580 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
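The download links in the table above are relative to this repository; a minimal sketch for turning a package filename into a full download URL (assuming the archives sit at the repo root and the standard Hugging Face `resolve/main` URL layout):

```python
# Build a direct download URL for one of the package archives listed above.
# Assumes the standard Hugging Face Hub URL layout for dataset repos.
REPO_ID = "CyberHarem/raphiel_shiraha_ainsworth_gabrieldropout"

def package_url(filename: str) -> str:
    """Return the resolve URL for an archive at the root of this repo."""
    return f"https://huggingface.co/datasets/{REPO_ID}/resolve/main/{filename}"

print(package_url("dataset-raw.zip"))
```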
zolak/twitter_dataset_80_1713224182
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 148796 num_examples: 390 download_size: 84367 dataset_size: 148796 configs: - config_name: default data_files: - split: train path: data/train-* ---
csujeong/Non_life_insurance
--- language: - ko --- Non-life insurance data (손해보험 데이터)
CyberHarem/henriette_fireemblem
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of henriette (Fire Emblem) This is the dataset of henriette (Fire Emblem), containing 22 images and their tags. The core tags of this character are `blonde_hair, breasts, green_eyes, multicolored_hair, gradient_hair, pink_hair, large_breasts, bangs, hair_ornament, braid, sidelocks`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 22 | 29.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/henriette_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 22 | 15.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/henriette_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 49 | 31.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/henriette_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 22 | 25.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/henriette_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
| | stage3-p480-1200 | 49 | 45.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/henriette_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/henriette_fireemblem', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------| | 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, smile, looking_at_viewer, blush, circlet, cape, flower, white_dress, jewelry, simple_background | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | looking_at_viewer | blush | circlet | cape | flower | white_dress | jewelry | simple_background | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:--------|:----------|:-------|:---------|:--------------|:----------|:--------------------| | 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X |
Lollitor/PROTEIN
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* dataset_info: features: - name: input dtype: string - name: -logKd/Ki dtype: float64 - name: __index_level_0__ dtype: int64 splits: - name: train num_bytes: 7899719 num_examples: 11213 - name: validation num_bytes: 878008 num_examples: 1246 download_size: 4294665 dataset_size: 8777727 --- # Dataset Card for "PROTEIN" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mvoisin/TinyCOCO
--- viewer: true dataset_info: features: - name: image_id dtype: int64 - name: image_url dtype: string - name: objects struct: - name: bbox sequence: sequence: float64 - name: category sequence: int64 - name: id sequence: int64 splits: - name: test num_bytes: 754 num_examples: 1 download_size: 0 dataset_size: 754 --- # Dataset Card for "COCO_small" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
alvarobartt/instruction-dataset-notus-7b-v1-inference-endpoints
--- dataset_info: features: - name: instruction dtype: string - name: generation dtype: string - name: model_name dtype: string splits: - name: test num_bytes: 245184 num_examples: 327 download_size: 156529 dataset_size: 245184 configs: - config_name: default data_files: - split: test path: data/test-* ---
laura63/BirdClefTop20
--- dataset_info: features: - name: primary_label dtype: string - name: common_name dtype: string - name: filename dtype: string - name: filepath dtype: string - name: new_filepath dtype: audio: sampling_rate: 16000 - name: label dtype: int64 - name: code dtype: int64 - name: __index_level_0__ dtype: int64 splits: - name: train num_bytes: 21436542043.0 num_examples: 5868 - name: test num_bytes: 2648516188.0 num_examples: 725 - name: val num_bytes: 2381838007.0 num_examples: 652 download_size: 395191103 dataset_size: 26466896238.0 --- # Dataset Card for "BirdClefTop20" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mserras/alpaca-es-hackaton-validated
--- dataset_info: features: - name: text dtype: 'null' - name: inputs struct: - name: 1-instruction dtype: string - name: 2-input dtype: string - name: 3-output dtype: string - name: prediction dtype: 'null' - name: prediction_agent dtype: 'null' - name: annotation dtype: string - name: annotation_agent dtype: string - name: vectors struct: - name: input sequence: float64 - name: instruction sequence: float64 - name: output sequence: float64 - name: multi_label dtype: bool - name: explanation dtype: 'null' - name: id dtype: string - name: metadata struct: - name: bias_score.label dtype: string - name: bias_score.score dtype: float64 - name: en_index dtype: int64 - name: hate_score.label dtype: string - name: hate_score.score dtype: float64 - name: sf-multi-unprocessable-score dtype: float64 - name: sf-unprocessable-score dtype: float64 - name: tr-flag-1-instruction dtype: bool - name: tr-flag-2-input dtype: bool - name: tr-flag-3-output dtype: bool - name: status dtype: string - name: event_timestamp dtype: timestamp[us] - name: metrics struct: - name: text_length dtype: int64 splits: - name: train num_bytes: 16651162 num_examples: 882 download_size: 0 dataset_size: 16651162 --- # Dataset Card for "alpaca-es-hackaton-validated" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
liuyanchen1015/MULTI_VALUE_sst2_existential_you_have
--- dataset_info: features: - name: sentence dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: score dtype: int64 splits: - name: dev num_bytes: 4813 num_examples: 31 - name: test num_bytes: 6763 num_examples: 45 - name: train num_bytes: 70168 num_examples: 550 download_size: 39808 dataset_size: 81744 --- # Dataset Card for "MULTI_VALUE_sst2_existential_you_have" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
skrishna/heart_disease_uci
---
license: cc-by-4.0
---

# Dataset Card for Dataset Name

- `age`: age in years
- `sex`: sex (1 = male; 0 = female)
- `cp`: chest pain type -- Value 1: typical angina -- Value 2: atypical angina -- Value 3: non-anginal pain -- Value 4: asymptomatic
- `trestbps`: resting blood pressure (in mm Hg on admission to the hospital)
- `chol`: serum cholesterol in mg/dl
- `fbs`: fasting blood sugar > 120 mg/dl (1 = true; 0 = false)
- `restecg`: resting electrocardiographic results -- Value 0: normal -- Value 1: having ST-T wave abnormality (T wave inversions and/or ST elevation or depression of > 0.05 mV) -- Value 2: showing probable or definite left ventricular hypertrophy by Estes' criteria
- `thalach`: maximum heart rate achieved
- `exang`: exercise induced angina (1 = yes; 0 = no)
- `oldpeak`: ST depression induced by exercise relative to rest
- `slope`: the slope of the peak exercise ST segment -- Value 1: upsloping -- Value 2: flat -- Value 3: downsloping
- `ca`: number of major vessels (0-3) colored by fluoroscopy
- `thal`: 3 = normal; 6 = fixed defect; 7 = reversible defect
- `target`: diagnosis of heart disease (angiographic disease status) -- Value 0: < 50% diameter narrowing -- Value 1: > 50% diameter narrowing (in any major vessel: attributes 59 through 68 are vessels)

## Dataset Description

- **Homepage:** https://archive.ics.uci.edu/dataset/45/heart+disease
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**

### Dataset Summary

This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
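For quick inspection, the coded attributes above can be mapped back to readable labels. A minimal sketch (the lookup tables and helper below are illustrative, not part of the dataset):

```python
# Hypothetical lookup tables built from the attribute descriptions above.
CP = {1: "typical angina", 2: "atypical angina", 3: "non-anginal pain", 4: "asymptomatic"}
SLOPE = {1: "upsloping", 2: "flat", 3: "downsloping"}
THAL = {3: "normal", 6: "fixed defect", 7: "reversible defect"}

def describe(row):
    """Return a readable summary of one coded record."""
    return {
        "sex": "male" if row["sex"] == 1 else "female",
        "chest_pain": CP[row["cp"]],
        "st_slope": SLOPE[row["slope"]],
        "thal": THAL[row["thal"]],
        "disease": row["target"] == 1,
    }

example = {"sex": 1, "cp": 4, "slope": 2, "thal": 7, "target": 1}
print(describe(example))
```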
### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
Krzysko1/komandiero_bombardiero
--- license: cc-by-nc-4.0 ---
CyberHarem/hieda_no_akyuu_touhou
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of hieda_no_akyuu/ひえだのあきゅう/稗田阿求 (Touhou)

This is the dataset of hieda_no_akyuu/ひえだのあきゅう/稗田阿求 (Touhou), containing 266 images and their tags. The core tags of this character are `hair_ornament, hair_flower, purple_hair, short_hair, purple_eyes`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------|:-----------|:---------------------------------------------------------------------|
| raw | 266 | 303.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hieda_no_akyuu_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 266 | 214.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hieda_no_akyuu_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 562 | 390.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hieda_no_akyuu_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 266 | 284.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hieda_no_akyuu_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 562 | 478.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hieda_no_akyuu_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/hieda_no_akyuu_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------|
| 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, flower, kimono, open_mouth, solo, smile |
| 1 | 23 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, flower, kimono, solo, smile, scroll |
| 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, calligraphy_brush, flower, solo, kimono, open_mouth, scroll, smile |
| 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, butterfly, flower, solo, petals, profile, green_kimono |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | flower | kimono | open_mouth | solo | smile | scroll | calligraphy_brush | butterfly | petals | profile | green_kimono |
|----:|----------:|:------|:------|:------|:------|:------|:--------|:---------|:---------|:-------------|:-------|:--------|:---------|:--------------------|:------------|:---------|:----------|:---------------|
| 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | | | | | | |
| 1 | 23 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | X | X | X | | | | | |
| 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | X | X | X | | | | |
| 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | | X | | | | X | X | X | X |
kunishou/oasst2-chat-68k-ja
---
license: apache-2.0
language:
- ja
---

This dataset is [oasst2-135k-ja](https://huggingface.co/datasets/kunishou/oasst2-135k-ja) converted into chat format. Please use it for fine-tuning on multi-turn conversations (each record has a large token length, so a fair amount of compute resources will be required). The format is ShareGPT; when fine-tuning, refer to [this article](https://note.com/npaka/n/n7cbe6f11526c).

OpenAssistant/oasst2
https://huggingface.co/datasets/OpenAssistant/oasst2
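The ShareGPT layout mentioned above is, in essence, a list of alternating human/gpt turns per record. A hypothetical record (field names follow the common ShareGPT convention; check the actual column layout against the data itself):

```python
# A hypothetical multi-turn record in ShareGPT-style format.
record = {
    "conversations": [
        {"from": "human", "value": "こんにちは"},
        {"from": "gpt", "value": "Hello! How can I help you?"},
        {"from": "human", "value": "..."},
        {"from": "gpt", "value": "..."},
    ]
}

def is_alternating(conversations):
    """Check that turns strictly alternate, starting with 'human'."""
    expected = "human"
    for turn in conversations:
        if turn["from"] != expected:
            return False
        expected = "gpt" if expected == "human" else "human"
    return True

print(is_alternating(record["conversations"]))  # True
```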
anan-2024/twitter_dataset_1713113638
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 121986 num_examples: 314 download_size: 67959 dataset_size: 121986 configs: - config_name: default data_files: - split: train path: data/train-* ---
datahrvoje/twitter_dataset_1713136059
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 24053 num_examples: 59 download_size: 12669 dataset_size: 24053 configs: - config_name: default data_files: - split: train path: data/train-* ---
DigKingy/LamettaFactor
--- license: unknown ---
joey234/mmlu-security_studies-neg
--- dataset_info: features: - name: choices sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question dtype: string splits: - name: test num_bytes: 203688 num_examples: 245 download_size: 113721 dataset_size: 203688 --- # Dataset Card for "mmlu-security_studies-neg" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
maximoss/lingnli-multi-mt
---
license: bsd-2-clause
language:
- el
- fr
- it
- es
- pt
- ko
- fi
- lt
- bg
task_categories:
- text-classification
task_ids:
- natural-language-inference
- multi-input-text-classification
size_categories:
- 10K<n<100K
---

# Dataset Card for Dataset Name

## Dataset Description

- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**

### Dataset Summary

This repository contains a collection of machine translations of the [LingNLI](https://github.com/Alicia-Parrish/ling_in_loop) dataset into 9 different languages (Bulgarian, Finnish, French, Greek, Italian, Korean, Lithuanian, Portuguese, Spanish). The goal is to predict textual entailment (does sentence A imply/contradict/neither sentence B), which is a classification task (given two sentences, predict one of three labels). It is formatted here in the same manner as the widely used [XNLI](https://huggingface.co/datasets/xnli) dataset for convenience. If you want to use this dataset in only one of the languages provided, you can filter the data on the `language` column.

### Supported Tasks and Leaderboards

This dataset can be used for the task of Natural Language Inference (NLI), also known as Recognizing Textual Entailment (RTE), which is a sentence-pair classification task.

## Dataset Structure

### Data Fields

- `language`: The language in which the pair of sentences is given.
- `premise`: The machine translated premise in the target language.
- `hypothesis`: The machine translated hypothesis in the target language.
- `label`: The classification label, with possible values 0 (`entailment`), 1 (`neutral`), 2 (`contradiction`).
- `label_text`: The classification label, with possible values `entailment` (0), `neutral` (1), `contradiction` (2).
- `premise_original`: The original premise from the English source dataset.
- `hypothesis_original`: The original hypothesis from the English source dataset.
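Filtering on the `language` column, as suggested above, is straightforward. A minimal sketch on plain rows (with the `datasets` library the equivalent would be `dataset.filter(lambda ex: ex["language"] == "fr")`; the rows below are illustrative):

```python
# Illustrative rows following the field layout described above.
rows = [
    {"language": "fr", "premise": "p1", "hypothesis": "h1", "label": 0},
    {"language": "ko", "premise": "p2", "hypothesis": "h2", "label": 1},
    {"language": "fr", "premise": "p3", "hypothesis": "h3", "label": 2},
]

def keep_language(rows, lang):
    """Keep only the examples whose `language` column equals `lang`."""
    return [r for r in rows if r["language"] == lang]

print(len(keep_language(rows, "fr")))  # 2
```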
### Data Splits

For the whole dataset (LitL and LotS subsets):

| language | train | validation |
|-------------|----:|---------:|
| all_languages | 269865 | 44037 |
| el-gr | 29985 | 4893 |
| fr | 29985 | 4893 |
| it | 29985 | 4893 |
| es | 29985 | 4893 |
| pt | 29985 | 4893 |
| ko | 29985 | 4893 |
| fi | 29985 | 4893 |
| lt | 29985 | 4893 |
| bg | 29985 | 4893 |

For the LitL subset:

| language | train | validation |
|-------------|----:|---------:|
| all_languages | 134955 | 21825 |
| el-gr | 14995 | 2425 |
| fr | 14995 | 2425 |
| it | 14995 | 2425 |
| es | 14995 | 2425 |
| pt | 14995 | 2425 |
| ko | 14995 | 2425 |
| fi | 14995 | 2425 |
| lt | 14995 | 2425 |
| bg | 14995 | 2425 |

For the LotS subset:

| language | train | validation |
|-------------|----:|---------:|
| all_languages | 134910 | 22212 |
| el-gr | 14990 | 2468 |
| fr | 14990 | 2468 |
| it | 14990 | 2468 |
| es | 14990 | 2468 |
| pt | 14990 | 2468 |
| ko | 14990 | 2468 |
| fi | 14990 | 2468 |
| lt | 14990 | 2468 |
| bg | 14990 | 2468 |

## Dataset Creation

The two subsets of the original dataset were machine translated using the latest neural machine translation [opus-mt-tc-big](https://huggingface.co/models?sort=downloads&search=opus-mt-tc-big) models available for the respective languages. The translations were run from March 25, 2023 until April 8, 2023.
## Additional Information ### Citation Information **BibTeX:** ````BibTeX @inproceedings{parrish-etal-2021-putting-linguist, title = "Does Putting a Linguist in the Loop Improve {NLU} Data Collection?", author = "Parrish, Alicia and Huang, William and Agha, Omar and Lee, Soo-Hwan and Nangia, Nikita and Warstadt, Alexia and Aggarwal, Karmanya and Allaway, Emily and Linzen, Tal and Bowman, Samuel R.", booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2021", month = nov, year = "2021", address = "Punta Cana, Dominican Republic", publisher = "Association for Computational Linguistics", url = "https://aclanthology.org/2021.findings-emnlp.421", doi = "10.18653/v1/2021.findings-emnlp.421", pages = "4886--4901", abstract = "Many crowdsourced NLP datasets contain systematic artifacts that are identified only after data collection is complete. Earlier identification of these issues should make it easier to create high-quality training and evaluation data. We attempt this by evaluating protocols in which expert linguists work {`}in the loop{'} during data collection to identify and address these issues by adjusting task instructions and incentives. Using natural language inference as a test case, we compare three data collection protocols: (i) a baseline protocol with no linguist involvement, (ii) a linguist-in-the-loop intervention with iteratively-updated constraints on the writing task, and (iii) an extension that adds direct interaction between linguists and crowdworkers via a chatroom. We find that linguist involvement does not lead to increased accuracy on out-of-domain test sets compared to baseline, and adding a chatroom has no effect on the data. 
Linguist involvement does, however, lead to more challenging evaluation data and higher accuracy on some challenge sets, demonstrating the benefits of integrating expert analysis during data collection.", } @inproceedings{tiedemann-thottingal-2020-opus, title = "{OPUS}-{MT} {--} Building open translation services for the World", author = {Tiedemann, J{\"o}rg and Thottingal, Santhosh}, booktitle = "Proceedings of the 22nd Annual Conference of the European Association for Machine Translation", month = nov, year = "2020", address = "Lisboa, Portugal", publisher = "European Association for Machine Translation", url = "https://aclanthology.org/2020.eamt-1.61", pages = "479--480", abstract = "This paper presents OPUS-MT a project that focuses on the development of free resources and tools for machine translation. The current status is a repository of over 1,000 pre-trained neural machine translation models that are ready to be launched in on-line translation services. For this we also provide open source implementations of web applications that can run efficiently on average desktop hardware with a straightforward setup and installation.", } ```` **ACL:** Alicia Parrish, William Huang, Omar Agha, Soo-Hwan Lee, Nikita Nangia, Alexia Warstadt, Karmanya Aggarwal, Emily Allaway, Tal Linzen, and Samuel R. Bowman. 2021. [Does Putting a Linguist in the Loop Improve NLU Data Collection?](https://aclanthology.org/2021.findings-emnlp.421). In *Findings of the Association for Computational Linguistics: EMNLP 2021*, pages 4886–4901, Punta Cana, Dominican Republic. Association for Computational Linguistics. Jörg Tiedemann and Santhosh Thottingal. 2020. [OPUS-MT – Building open translation services for the World](https://aclanthology.org/2020.eamt-1.61). In *Proceedings of the 22nd Annual Conference of the European Association for Machine Translation*, pages 479–480, Lisboa, Portugal. European Association for Machine Translation. 
### Acknowledgements These translations of the original dataset were done as part of a research project supported by the Defence Innovation Agency (AID) of the Directorate General of Armament (DGA) of the French Ministry of Armed Forces, and by the ICO, _Institut Cybersécurité Occitanie_, funded by Région Occitanie, France.
Svenni551/Invoice
--- dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 1034266.0 num_examples: 10 download_size: 1035462 dataset_size: 1034266.0 configs: - config_name: default data_files: - split: train path: data/train-* ---
biglam/encyclopaedia_britannica_illustrated
--- annotations_creators: - expert-generated language: [] language_creators: [] license: - cc0-1.0 multilinguality: [] pretty_name: Encyclopaedia Britannica Illustrated size_categories: - 1K<n<10K source_datasets: [] tags: [] task_categories: - image-classification task_ids: [] --- # Dataset card for Encyclopaedia Britannica Illustrated ## Table of Contents - [Dataset Card Creation Guide](#dataset-card-creation-guide) - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Initial Data Collection and Normalization](#initial-data-collection-and-normalization) - [Who are the source language producers?](#who-are-the-source-language-producers) - [Annotations](#annotations) - [Annotation process](#annotation-process) - [Who are the annotators?](#who-are-the-annotators) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** [https://data.nls.uk/data/digitised-collections/encyclopaedia-britannica/](https://data.nls.uk/data/digitised-collections/encyclopaedia-britannica/) ### Dataset Summary ### Supported Tasks and Leaderboards ### Languages ## 
Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Citation Information ### Contributions Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset.
zen-E/NEWS5M-simcse-roberta-large-embeddings-pca-256
---
task_categories:
- sentence-similarity
language:
- en
size_categories:
- 1M<n<10M
---

A dataset that contains all of the data in 'ffgcc/NEWS5M', together with the corresponding text embedding for each example produced by 'princeton-nlp/unsup-simcse-roberta-large'. The features are transformed to a size of 256 by PCA.

Usage:

```python
import torch

news5M_kd_pca_dataset_unsup = torch.load('./NEWS5M-simcse-roberta-large-embeddings-pca-256/news5M_kd_pca_dataset_unsup.pt')
```
adelavega/dominoes2
--- dataset_info: features: - name: name dtype: string - name: uuid dtype: string - name: status dtype: string - name: image dtype: image - name: label.annotations list: - name: id dtype: int32 - name: category_id dtype: int32 - name: label.segmentation_bitmap dtype: image splits: - name: train num_bytes: 641612760.0 num_examples: 763 download_size: 58051139 dataset_size: 641612760.0 --- # Dataset Card for "dominoes2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
DuongTrongChi/facebook-commet-classification-base
--- dataset_info: features: - name: text dtype: string - name: labels sequence: string splits: - name: train num_bytes: 663401 num_examples: 3967 - name: test num_bytes: 119494 num_examples: 772 - name: dev num_bytes: 52819 num_examples: 330 download_size: 498935 dataset_size: 835714 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* - split: dev path: data/dev-* ---
mextre/frieren
--- license: unknown ---
sproos/scifact-fr
--- configs: - config_name: default data_files: - split: queries path: data/queries-* - split: corpus path: data/corpus-* dataset_info: features: - name: _id dtype: string - name: title dtype: string - name: text dtype: string splits: - name: queries num_bytes: 143388 num_examples: 1109 - name: corpus num_bytes: 9644079 num_examples: 5183 download_size: 78989 dataset_size: 9787467 --- # Dataset Card for "scifact-fr" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Xilabs/instructmix
--- dataset_info: features: - name: output dtype: string - name: instruction dtype: string - name: input dtype: string - name: source dtype: string splits: - name: instructmix_15k num_bytes: 10498076 num_examples: 15000 - name: instructmix_30k num_bytes: 21008700 num_examples: 30000 - name: instructmix_50k num_bytes: 34872601 num_examples: 50000 - name: instructmix_15k_balanced num_bytes: 9550701 num_examples: 15000 - name: instructmix_30k_balanced num_bytes: 19149564 num_examples: 30000 - name: instructmix_all num_bytes: 59355817 num_examples: 87039 download_size: 94447900 dataset_size: 154435459 language: - en tags: - instruction-finetuning pretty_name: InstructMix task_categories: - text-generation size_categories: - 10K<n<100K --- ## Dataset Card for "InstructMix" **Description:** InstructMix is a versatile instruction-tuning dataset available in Alpaca format. It encompasses a variety of instruction-related tasks and sources, making it well suited for finetuning instruction following Large Language Models. #### Included Datasets: | Dataset Name | Size | Type | Details | GitHub Repo | |--------------|----------------|---------------------------------------------------|-----------------------------------------|-------------------------------------------------| | Alpaca_GPT4 | 52,002 examples| General Instruction | Generated by GPT-4 using Alpaca | [GitHub Repo](https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM) | | dolly 2.0 | 15,015 examples| Closed QA, Summarization, etc. (Wikipedia) | Human Annotated | [GitHub Repo](https://github.com/databrickslabs/dolly) | | Code Alpaca | 20,022 examples| Code Generation, Editing, Optimization | Generated by text-davinci-003 | [GitHub Repo](https://github.com/sahil280114/codealpaca) | Credit for the data source: [alpaca-CoT](https://github.com/PhoebusSi/alpaca-CoT) #### Dataset Splits: InstructMix offers several dataset splits, each containing a mix of examples from the mentioned datasets. 1. 
**instructmix_15k**: 40% Alpaca_GPT4, 40% dolly 2.0, 20% Code Alpaca (15,000 randomly chosen samples according to weightage; in our experience this weightage gives better performance when training LLMs) 2. **instructmix_30k**: 40% Alpaca_GPT4, 40% dolly 2.0, 20% Code Alpaca (30,000 randomly chosen samples according to weightage; in our experience this weightage gives better performance when training LLMs) 3. **instructmix_50k**: 40% Alpaca_GPT4, 40% dolly 2.0, 20% Code Alpaca (50,000 randomly chosen samples according to weightage; in our experience this weightage gives better performance when training LLMs) 4. **instructmix_15k_balanced**: Equal distribution of samples from Alpaca_GPT4, dolly 2.0, and Code Alpaca (15,000 examples) 5. **instructmix_30k_balanced**: Equal distribution of samples from Alpaca_GPT4, dolly 2.0, and Code Alpaca (30,000 examples) 6. **instructmix_all**: All available samples from the mentioned datasets **Models Trained on InstructMix:** - [Xilabs/instructmix-llama-3b](https://huggingface.co/Xilabs/instructmix-llama-3b) **Future Updates:** The InstructMix family of datasets is a rapidly evolving one, with plans to incorporate more curated data for instruction tuning. The creators are currently developing a new InstructMix dataset that will include conversational data.
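The 40/40/20 weighting described above can be reproduced with simple weighted sampling. A minimal sketch (the pools and helper below are illustrative, not the script used to build the released splits):

```python
import random

# Stand-ins for the three source datasets, sized as in the table above.
pools = {
    "alpaca_gpt4": list(range(52002)),
    "dolly": list(range(15015)),
    "code_alpaca": list(range(20022)),
}
weights = {"alpaca_gpt4": 0.4, "dolly": 0.4, "code_alpaca": 0.2}

def weighted_mix(pools, weights, total, seed=0):
    """Draw `total` examples, split across pools according to `weights`."""
    rng = random.Random(seed)
    mix = []
    for name, pool in pools.items():
        mix.extend(rng.sample(pool, round(total * weights[name])))
    rng.shuffle(mix)
    return mix

split_15k = weighted_mix(pools, weights, 15000)
print(len(split_15k))  # 15000
```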
316usman/thematic4d-pw-embed-part3
--- dataset_info: features: - name: text dtype: string - name: country dtype: string - name: document_url dtype: string - name: source_url dtype: string - name: num_tokens dtype: int64 splits: - name: train num_bytes: 408114269 num_examples: 616322 download_size: 154413613 dataset_size: 408114269 configs: - config_name: default data_files: - split: train path: data/train-* ---
Hack90/ncbi_genbank_part_16
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: id dtype: string - name: sequence dtype: string - name: name dtype: string - name: description dtype: string - name: features dtype: int64 - name: seq_length dtype: int64 splits: - name: train num_bytes: 9781284891 num_examples: 14048187 download_size: 4047367895 dataset_size: 9781284891 --- # Dataset Card for "ncbi_genbank_part_16" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mHossain/final_train_v4_test_560000
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: 'Unnamed: 0' dtype: int64 - name: input_text dtype: string - name: target_text dtype: string - name: prefix dtype: string splits: - name: train num_bytes: 6709050.0 num_examples: 18000 - name: test num_bytes: 745450.0 num_examples: 2000 download_size: 3206263 dataset_size: 7454500.0 --- # Dataset Card for "final_train_v4_test_560000" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
AIRLab-POLIMI/btgenbot
---
license: mit
task_categories:
- text-generation
- robotics
language:
- en
pretty_name: BTGenBot
size_categories:
- n<1K
---

Dataset release for the paper **BTGenBot: Behavior Tree Generation for Robotic Tasks with Lightweight LLMs**, currently in submission at the **IEEE/RSJ International Conference on Intelligent Robots and Systems**.

[`GitHub Repository`](https://github.com/AIRLab-POLIMI/BTGenBot)

[`Paper`](https://arxiv.org/abs/2403.12761)
tasksource/folio
--- license: cc task_categories: - text-classification language: - en task_ids: - natural-language-inference - multi-input-text-classification --- https://github.com/Yale-LILY/FOLIO ``` @article{han2022folio, title={FOLIO: Natural Language Reasoning with First-Order Logic}, author = {Han, Simeng and Schoelkopf, Hailey and Zhao, Yilun and Qi, Zhenting and Riddell, Martin and Benson, Luke and Sun, Lucy and Zubova, Ekaterina and Qiao, Yujie and Burtell, Matthew and Peng, David and Fan, Jonathan and Liu, Yixin and Wong, Brian and Sailor, Malcolm and Ni, Ansong and Nan, Linyong and Kasai, Jungo and Yu, Tao and Zhang, Rui and Joty, Shafiq and Fabbri, Alexander R. and Kryscinski, Wojciech and Lin, Xi Victoria and Xiong, Caiming and Radev, Dragomir}, journal={arXiv preprint arXiv:2209.00840}, url = {https://arxiv.org/abs/2209.00840}, year={2022} } ```
Rimyy/Math-llama2-200k
--- dataset_info: features: - name: question dtype: string - name: answer dtype: string splits: - name: train num_bytes: 225322861 num_examples: 200035 download_size: 84227576 dataset_size: 225322861 configs: - config_name: default data_files: - split: train path: data/train-* ---
chau520/autotrain-data-fine-tune-english-chinese
--- language: - zh - en task_categories: - translation --- # AutoTrain Dataset for project: fine-tune-english-chinese ## Dataset Description This dataset has been automatically processed by AutoTrain for project fine-tune-english-chinese. ### Languages The BCP-47 code for the dataset's language is zh2en. ## Dataset Structure ### Data Instances A sample from this dataset looks as follows: ```json [ { "source": "It is not difficult to hear importing workers in Hong Kong.", "target": "\u5728\u9999\u6e2f\uff0c\u8981\u542c\u5230\u8fdb\u53e3\u5de5\u4eba\u7684\u58f0\u97f3\u5e76\u4e0d\u96be\u3002" }, { "source": "And most of these importing workers are professionals.", "target": "\u800c\u8fd9\u4e9b\u8fdb\u53e3\u5de5\u4eba\u5927\u591a\u662f\u4e13\u4e1a\u4eba\u58eb\u3002" } ] ``` ### Dataset Fields The dataset has the following fields (also called "features"): ```json { "source": "Value(dtype='string', id=None)", "target": "Value(dtype='string', id=None)" } ``` ### Dataset Splits This dataset is split into a train and validation split. The split sizes are as follow: | Split name | Num samples | | ------------ | ------------------- | | train | 2 | | valid | 1 |
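The `\uXXXX` sequences in the sample above are ordinary JSON escapes for the Chinese characters, not corruption; any JSON parser restores the original text:

```python
import json

# The first three characters of the sample's target string, as escaped JSON.
escaped = '"\\u5728\\u9999\\u6e2f"'
print(json.loads(escaped))  # 在香港
```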
AdapterOcean/med_alpaca_standardized_cluster_15
--- dataset_info: features: - name: text dtype: string - name: conversation_id dtype: int64 - name: embedding sequence: float64 - name: cluster dtype: int64 splits: - name: train num_bytes: 150029254 num_examples: 14740 download_size: 45709470 dataset_size: 150029254 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "med_alpaca_standardized_cluster_15" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_Corianas__Neural-Mistral-7B
--- pretty_name: Evaluation run of Corianas/Neural-Mistral-7B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Corianas/Neural-Mistral-7B](https://huggingface.co/Corianas/Neural-Mistral-7B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Corianas__Neural-Mistral-7B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-06T00:13:14.700675](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__Neural-Mistral-7B/blob/main/results_2024-03-06T00-13-14.700675.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6087647678908085,\n\ \ \"acc_stderr\": 0.03306011384756514,\n \"acc_norm\": 0.6137764206014187,\n\ \ \"acc_norm_stderr\": 0.033730529456454196,\n \"mc1\": 0.5471236230110159,\n\ \ \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.692579382132414,\n\ \ \"mc2_stderr\": 0.015033809022649157\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225407,\n\ \ \"acc_norm\": 0.6339590443686007,\n \"acc_norm_stderr\": 0.01407722310847014\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6742680740888269,\n\ \ \"acc_stderr\": 0.0046768988619789115,\n \"acc_norm\": 0.8559051981676957,\n\ \ \"acc_norm_stderr\": 0.0035046810917039014\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\ \ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\ \ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.039105257528497236,\n\ \ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.039105257528497236\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\ \ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \ \ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n\ \ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\ \ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\ \ \"acc_norm_stderr\": 0.03827052357950756\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \ \ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\ : 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\ \ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\ \ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\ \ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n\ \ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n\ \ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\ \ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n\ \ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\ \ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"\ acc_norm\": 0.3783068783068783,\n 
\"acc_norm_stderr\": 0.024976954053155254\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\ \ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\ \ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.632258064516129,\n\ \ \"acc_stderr\": 0.02743086657997347,\n \"acc_norm\": 0.632258064516129,\n\ \ \"acc_norm_stderr\": 0.02743086657997347\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\ \ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\ : 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\ \ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124488,\n \"\ acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124488\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306443,\n\ \ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306443\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5641025641025641,\n \"acc_stderr\": 0.025141801511177495,\n\ \ \"acc_norm\": 0.5641025641025641,\n \"acc_norm_stderr\": 0.025141801511177495\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \ \ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \ \ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\ acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8,\n \"acc_stderr\": 0.017149858514250948,\n \"acc_norm\": 0.8,\n\ \ \"acc_norm_stderr\": 0.017149858514250948\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\ : {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n\ \ \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\ acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114969,\n \ \ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114969\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6098654708520179,\n\ \ \"acc_stderr\": 0.03273766725459156,\n \"acc_norm\": 0.6098654708520179,\n\ \ \"acc_norm_stderr\": 0.03273766725459156\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\ \ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 
0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\ acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\ \ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\ \ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\ \ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\ \ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\ \ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690879,\n\ \ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690879\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\ \ \"acc_stderr\": 0.02250903393707779,\n \"acc_norm\": 0.8632478632478633,\n\ \ \"acc_norm_stderr\": 0.02250903393707779\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.776500638569604,\n\ \ \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n\ \ \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n\ \ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30837988826815643,\n\ \ \"acc_stderr\": 0.015445716910998874,\n \"acc_norm\": 0.30837988826815643,\n\ \ \"acc_norm_stderr\": 
0.015445716910998874\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.02641560191438899,\n\ \ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.02641560191438899\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\ \ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\ \ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495033,\n\ \ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495033\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \ \ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4348109517601043,\n\ \ \"acc_stderr\": 0.012661233805616302,\n \"acc_norm\": 0.4348109517601043,\n\ \ \"acc_norm_stderr\": 0.012661233805616302\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02952009569768776,\n\ \ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02952009569768776\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6339869281045751,\n \"acc_stderr\": 0.019488025745529675,\n \ \ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.019488025745529675\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\ \ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n\ \ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\ \ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 
0.7114427860696517,\n\ \ \"acc_stderr\": 0.03203841040213322,\n \"acc_norm\": 0.7114427860696517,\n\ \ \"acc_norm_stderr\": 0.03203841040213322\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \ \ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\ \ \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.4939759036144578,\n\ \ \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\ \ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5471236230110159,\n\ \ \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.692579382132414,\n\ \ \"mc2_stderr\": 0.015033809022649157\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7742699289660616,\n \"acc_stderr\": 0.011749626260902547\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3752843062926459,\n \ \ \"acc_stderr\": 0.013337170545742934\n }\n}\n```" repo_url: https://huggingface.co/Corianas/Neural-Mistral-7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|arc:challenge|25_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-06T00-13-14.700675.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|gsm8k|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_06T00_13_14.700675 path: - 
'**/details_harness|hellaswag|10_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-06T00-13-14.700675.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-06T00-13-14.700675.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-06T00-13-14.700675.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-06T00-13-14.700675.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-06T00-13-14.700675.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-06T00-13-14.700675.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-06T00-13-14.700675.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-management|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-06T00-13-14.700675.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|truthfulqa:mc|0_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-06T00-13-14.700675.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_06T00_13_14.700675 path: - '**/details_harness|winogrande|5_2024-03-06T00-13-14.700675.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-06T00-13-14.700675.parquet' - config_name: results data_files: - split: 
2024_03_06T00_13_14.700675 path: - results_2024-03-06T00-13-14.700675.parquet - split: latest path: - results_2024-03-06T00-13-14.700675.parquet
---

# Dataset Card for Evaluation run of Corianas/Neural-Mistral-7B

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [Corianas/Neural-Mistral-7B](https://huggingface.co/Corianas/Neural-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Corianas__Neural-Mistral-7B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-03-06T00:13:14.700675](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__Neural-Mistral-7B/blob/main/results_2024-03-06T00-13-14.700675.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6087647678908085, "acc_stderr": 0.03306011384756514, "acc_norm": 0.6137764206014187, "acc_norm_stderr": 0.033730529456454196, "mc1": 0.5471236230110159, "mc1_stderr": 0.01742558984831402, "mc2": 0.692579382132414, "mc2_stderr": 0.015033809022649157 }, "harness|arc:challenge|25": { "acc": 0.5878839590443686, "acc_stderr": 0.014383915302225407, "acc_norm": 0.6339590443686007, "acc_norm_stderr": 0.01407722310847014 }, "harness|hellaswag|10": { "acc": 0.6742680740888269, "acc_stderr": 0.0046768988619789115, "acc_norm": 0.8559051981676957, "acc_norm_stderr": 0.0035046810917039014 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5703703703703704, "acc_stderr": 0.042763494943765995, "acc_norm": 0.5703703703703704, "acc_norm_stderr": 0.042763494943765995 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6381578947368421, "acc_stderr": 0.039105257528497236, "acc_norm": 0.6381578947368421, "acc_norm_stderr": 0.039105257528497236 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6792452830188679, "acc_stderr": 0.02872750295788027, "acc_norm": 0.6792452830188679, "acc_norm_stderr": 0.02872750295788027 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7013888888888888, "acc_stderr": 0.03827052357950756, "acc_norm": 0.7013888888888888, "acc_norm_stderr": 0.03827052357950756 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, 
"acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5953757225433526, "acc_stderr": 0.03742461193887248, "acc_norm": 0.5953757225433526, "acc_norm_stderr": 0.03742461193887248 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.04913595201274498, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.04913595201274498 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.68, "acc_stderr": 0.04688261722621504, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5361702127659574, "acc_stderr": 0.032600385118357715, "acc_norm": 0.5361702127659574, "acc_norm_stderr": 0.032600385118357715 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.41228070175438597, "acc_stderr": 0.046306532033665956, "acc_norm": 0.41228070175438597, "acc_norm_stderr": 0.046306532033665956 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6137931034482759, "acc_stderr": 0.04057324734419035, "acc_norm": 0.6137931034482759, "acc_norm_stderr": 0.04057324734419035 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3783068783068783, "acc_stderr": 0.024976954053155254, "acc_norm": 0.3783068783068783, "acc_norm_stderr": 0.024976954053155254 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4444444444444444, "acc_stderr": 0.044444444444444495, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.044444444444444495 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.632258064516129, "acc_stderr": 0.02743086657997347, "acc_norm": 0.632258064516129, "acc_norm_stderr": 0.02743086657997347 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5270935960591133, "acc_stderr": 0.03512819077876106, "acc_norm": 0.5270935960591133, "acc_norm_stderr": 0.03512819077876106 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.67, "acc_stderr": 0.04725815626252607, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252607 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7393939393939394, "acc_stderr": 0.034277431758165236, "acc_norm": 0.7393939393939394, "acc_norm_stderr": 0.034277431758165236 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7525252525252525, "acc_stderr": 0.030746300742124488, "acc_norm": 0.7525252525252525, "acc_norm_stderr": 0.030746300742124488 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8549222797927462, "acc_stderr": 0.025416343096306443, "acc_norm": 0.8549222797927462, "acc_norm_stderr": 0.025416343096306443 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5641025641025641, "acc_stderr": 0.025141801511177495, "acc_norm": 0.5641025641025641, "acc_norm_stderr": 0.025141801511177495 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3111111111111111, "acc_stderr": 0.028226446749683512, "acc_norm": 0.3111111111111111, "acc_norm_stderr": 0.028226446749683512 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6680672268907563, "acc_stderr": 0.03058869701378364, "acc_norm": 0.6680672268907563, "acc_norm_stderr": 0.03058869701378364 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8, "acc_stderr": 0.017149858514250948, "acc_norm": 0.8, "acc_norm_stderr": 0.017149858514250948 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.46296296296296297, "acc_stderr": 0.03400603625538271, "acc_norm": 
0.46296296296296297, "acc_norm_stderr": 0.03400603625538271 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7696078431372549, "acc_stderr": 0.029554292605695066, "acc_norm": 0.7696078431372549, "acc_norm_stderr": 0.029554292605695066 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.759493670886076, "acc_stderr": 0.02782078198114969, "acc_norm": 0.759493670886076, "acc_norm_stderr": 0.02782078198114969 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6098654708520179, "acc_stderr": 0.03273766725459156, "acc_norm": 0.6098654708520179, "acc_norm_stderr": 0.03273766725459156 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7480916030534351, "acc_stderr": 0.03807387116306085, "acc_norm": 0.7480916030534351, "acc_norm_stderr": 0.03807387116306085 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8099173553719008, "acc_stderr": 0.03581796951709282, "acc_norm": 0.8099173553719008, "acc_norm_stderr": 0.03581796951709282 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7423312883435583, "acc_stderr": 0.03436150827846917, "acc_norm": 0.7423312883435583, "acc_norm_stderr": 0.03436150827846917 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.047184714852195886, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.047184714852195886 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690879, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690879 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8632478632478633, "acc_stderr": 0.02250903393707779, "acc_norm": 0.8632478632478633, "acc_norm_stderr": 0.02250903393707779 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, 
"acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.776500638569604, "acc_stderr": 0.01489723522945071, "acc_norm": 0.776500638569604, "acc_norm_stderr": 0.01489723522945071 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6965317919075145, "acc_stderr": 0.024752411960917205, "acc_norm": 0.6965317919075145, "acc_norm_stderr": 0.024752411960917205 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.30837988826815643, "acc_stderr": 0.015445716910998874, "acc_norm": 0.30837988826815643, "acc_norm_stderr": 0.015445716910998874 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6928104575163399, "acc_stderr": 0.02641560191438899, "acc_norm": 0.6928104575163399, "acc_norm_stderr": 0.02641560191438899 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6977491961414791, "acc_stderr": 0.02608270069539966, "acc_norm": 0.6977491961414791, "acc_norm_stderr": 0.02608270069539966 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7098765432098766, "acc_stderr": 0.025251173936495033, "acc_norm": 0.7098765432098766, "acc_norm_stderr": 0.025251173936495033 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4645390070921986, "acc_stderr": 0.02975238965742705, "acc_norm": 0.4645390070921986, "acc_norm_stderr": 0.02975238965742705 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4348109517601043, "acc_stderr": 0.012661233805616302, "acc_norm": 0.4348109517601043, "acc_norm_stderr": 0.012661233805616302 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6176470588235294, "acc_stderr": 0.02952009569768776, "acc_norm": 0.6176470588235294, "acc_norm_stderr": 0.02952009569768776 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6339869281045751, "acc_stderr": 0.019488025745529675, "acc_norm": 0.6339869281045751, "acc_norm_stderr": 0.019488025745529675 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04265792110940589, "acc_norm": 
0.7272727272727273, "acc_norm_stderr": 0.04265792110940589 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7061224489795919, "acc_stderr": 0.02916273841024977, "acc_norm": 0.7061224489795919, "acc_norm_stderr": 0.02916273841024977 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7114427860696517, "acc_stderr": 0.03203841040213322, "acc_norm": 0.7114427860696517, "acc_norm_stderr": 0.03203841040213322 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.03942772444036625, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036625 }, "harness|hendrycksTest-virology|5": { "acc": 0.4939759036144578, "acc_stderr": 0.03892212195333047, "acc_norm": 0.4939759036144578, "acc_norm_stderr": 0.03892212195333047 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.5471236230110159, "mc1_stderr": 0.01742558984831402, "mc2": 0.692579382132414, "mc2_stderr": 0.015033809022649157 }, "harness|winogrande|5": { "acc": 0.7742699289660616, "acc_stderr": 0.011749626260902547 }, "harness|gsm8k|5": { "acc": 0.3752843062926459, "acc_stderr": 0.013337170545742934 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
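The results JSON shown above uses flat keys of the form `harness|<task>|<few-shot>`, each mapping to per-task `acc`/`acc_norm` scores. As an illustration only (the dict below copies just a handful of the values from the results above; the full file contains all 63 tasks), aggregating a task subset after parsing the JSON can be sketched as:

```python
# Aggregate per-task accuracies from an Open LLM Leaderboard results dict.
# Only a few tasks are reproduced here as a sample of the full results file.
results = {
    "harness|arc:challenge|25": {"acc": 0.5878839590443686},
    "harness|hellaswag|10": {"acc": 0.6742680740888269},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.28},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5703703703703704},
}

# Mean accuracy over the MMLU (hendrycksTest) subset of this sample.
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
mmlu_mean = sum(mmlu) / len(mmlu)
print(round(mmlu_mean, 4))  # → 0.4252 for this two-task sample
```

On the complete results file, the same filter averages over all 57 MMLU subtasks instead of the two sampled here.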
liuyanchen1015/MULTI_VALUE_mrpc_double_superlative
--- dataset_info: features: - name: sentence1 dtype: string - name: sentence2 dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: test num_bytes: 23251 num_examples: 82 - name: train num_bytes: 38778 num_examples: 131 - name: validation num_bytes: 6076 num_examples: 20 download_size: 54718 dataset_size: 68105 --- # Dataset Card for "MULTI_VALUE_mrpc_double_superlative" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
NimaBoscarino/fuego-20230224-000744-dd084d
--- tags: - fuego fuego: id: 20230224-000744-dd084d status: done script: train.py requirements_file: requirements.txt space_id: NimaBoscarino/fuego-20230224-000744-dd084d space_hardware: cpu-basic ---
open-llm-leaderboard/details_NovoCode__Mistral-NeuralDPO-v0.3
--- pretty_name: Evaluation run of NovoCode/Mistral-NeuralDPO-v0.3 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [NovoCode/Mistral-NeuralDPO-v0.3](https://huggingface.co/NovoCode/Mistral-NeuralDPO-v0.3)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of\ \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NovoCode__Mistral-NeuralDPO-v0.3\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-02-19T10:09:09.378755](https://huggingface.co/datasets/open-llm-leaderboard/details_NovoCode__Mistral-NeuralDPO-v0.3/blob/main/results_2024-02-19T10-09-09.378755.json)(note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6136008844819947,\n\ \ \"acc_stderr\": 0.03271584502877658,\n \"acc_norm\": 0.61963655051764,\n\ \ \"acc_norm_stderr\": 0.03338202033480734,\n \"mc1\": 0.29498164014687883,\n\ \ \"mc1_stderr\": 0.01596440096558966,\n \"mc2\": 0.4531069456128054,\n\ \ \"mc2_stderr\": 0.01430354313553265\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520769,\n\ \ \"acc_norm\": 0.6160409556313993,\n \"acc_norm_stderr\": 0.014212444980651892\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6150169288986258,\n\ \ \"acc_stderr\": 0.004855968578998724,\n \"acc_norm\": 0.8315076677952599,\n\ \ \"acc_norm_stderr\": 0.003735379375255013\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\ \ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\ \ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n\ \ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\ \ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \ \ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.028985455652334395,\n\ \ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.028985455652334395\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\ \ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\ \ \"acc_norm_stderr\": 0.038009680605548594\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \ \ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\ \ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \ \ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\ \ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\ \ \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.6127167630057804,\n\ \ \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629454,\n\ \ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629454\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\ \ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\ \ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\ \ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\ \ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\ \ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"\ acc_norm\": 0.3941798941798942,\n 
\"acc_norm_stderr\": 0.02516798233389414\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\ \ \"acc_stderr\": 0.04240799327574925,\n \"acc_norm\": 0.3412698412698413,\n\ \ \"acc_norm_stderr\": 0.04240799327574925\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n\ \ \"acc_stderr\": 0.024580028921481006,\n \"acc_norm\": 0.7516129032258064,\n\ \ \"acc_norm_stderr\": 0.024580028921481006\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\ \ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\ : 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\ \ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\"\ : 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n\ \ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \ \ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397457,\n\ \ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397457\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.617948717948718,\n \"acc_stderr\": 0.024635549163908234,\n \ \ \"acc_norm\": 0.617948717948718,\n \"acc_norm_stderr\": 0.024635549163908234\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \ \ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.03149930577784906,\n \ \ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.03149930577784906\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\ acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8073394495412844,\n \"acc_stderr\": 0.01690927688493608,\n \"\ acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.01690927688493608\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\ acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"\ acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \ \ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\ \ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n\ \ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728742,\n\ \ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728742\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\ acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\ \ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\ \ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\ \ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\ \ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\ \ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\ \ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\ \ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\ \ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \ \ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7867177522349936,\n\ \ \"acc_stderr\": 0.014648172749593522,\n \"acc_norm\": 0.7867177522349936,\n\ \ \"acc_norm_stderr\": 0.014648172749593522\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.024685316867257796,\n\ \ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.024685316867257796\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2681564245810056,\n\ \ \"acc_stderr\": 0.014816119635317012,\n 
\"acc_norm\": 0.2681564245810056,\n\ \ \"acc_norm_stderr\": 0.014816119635317012\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\ \ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\ \ \"acc_stderr\": 0.025494259350694902,\n \"acc_norm\": 0.7202572347266881,\n\ \ \"acc_norm_stderr\": 0.025494259350694902\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.02555765398186806,\n\ \ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.02555765398186806\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \ \ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44784876140808344,\n\ \ \"acc_stderr\": 0.012700582404768224,\n \"acc_norm\": 0.44784876140808344,\n\ \ \"acc_norm_stderr\": 0.012700582404768224\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n\ \ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6617647058823529,\n \"acc_stderr\": 0.01913994374848704,\n \ \ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.01913994374848704\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\ \ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\ \ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n\ \ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n\ \ 
},\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\ \ \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n\ \ \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \ \ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\ \ \"acc_stderr\": 0.03878626771002361,\n \"acc_norm\": 0.5421686746987951,\n\ \ \"acc_norm_stderr\": 0.03878626771002361\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\ \ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29498164014687883,\n\ \ \"mc1_stderr\": 0.01596440096558966,\n \"mc2\": 0.4531069456128054,\n\ \ \"mc2_stderr\": 0.01430354313553265\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.011646276755089694\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.34874905231235787,\n \ \ \"acc_stderr\": 0.01312722705503586\n }\n}\n```" repo_url: https://huggingface.co/NovoCode/Mistral-NeuralDPO-v0.3 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|arc:challenge|25_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-02-19T10-09-09.378755.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|gsm8k|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_02_19T10_09_09.378755 
path: - '**/details_harness|hellaswag|10_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T10-09-09.378755.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-19T10-09-09.378755.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T10-09-09.378755.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T10-09-09.378755.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T10-09-09.378755.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-19T10-09-09.378755.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T10-09-09.378755.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-management|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-marketing|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-virology|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T10-09-09.378755.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|truthfulqa:mc|0_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-02-19T10-09-09.378755.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_02_19T10_09_09.378755 path: - '**/details_harness|winogrande|5_2024-02-19T10-09-09.378755.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-02-19T10-09-09.378755.parquet' - config_name: results data_files: - split: 
2024_02_19T10_09_09.378755 path: - results_2024-02-19T10-09-09.378755.parquet - split: latest path: - results_2024-02-19T10-09-09.378755.parquet --- # Dataset Card for Evaluation run of NovoCode/Mistral-NeuralDPO-v0.3 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [NovoCode/Mistral-NeuralDPO-v0.3](https://huggingface.co/NovoCode/Mistral-NeuralDPO-v0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_NovoCode__Mistral-NeuralDPO-v0.3", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-19T10:09:09.378755](https://huggingface.co/datasets/open-llm-leaderboard/details_NovoCode__Mistral-NeuralDPO-v0.3/blob/main/results_2024-02-19T10-09-09.378755.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6136008844819947, "acc_stderr": 0.03271584502877658, "acc_norm": 0.61963655051764, "acc_norm_stderr": 0.03338202033480734, "mc1": 0.29498164014687883, "mc1_stderr": 0.01596440096558966, "mc2": 0.4531069456128054, "mc2_stderr": 0.01430354313553265 }, "harness|arc:challenge|25": { "acc": 0.5750853242320819, "acc_stderr": 0.014445698968520769, "acc_norm": 0.6160409556313993, "acc_norm_stderr": 0.014212444980651892 }, "harness|hellaswag|10": { "acc": 0.6150169288986258, "acc_stderr": 0.004855968578998724, "acc_norm": 0.8315076677952599, "acc_norm_stderr": 0.003735379375255013 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5925925925925926, "acc_stderr": 0.04244633238353228, "acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.04244633238353228 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6513157894736842, "acc_stderr": 0.0387813988879761, "acc_norm": 0.6513157894736842, "acc_norm_stderr": 0.0387813988879761 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6679245283018868, "acc_stderr": 0.028985455652334395, "acc_norm": 0.6679245283018868, "acc_norm_stderr": 0.028985455652334395 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7083333333333334, "acc_stderr": 0.038009680605548594, "acc_norm": 0.7083333333333334, "acc_norm_stderr": 0.038009680605548594 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, "acc_norm_stderr": 
0.05024183937956911 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6127167630057804, "acc_stderr": 0.037143259063020656, "acc_norm": 0.6127167630057804, "acc_norm_stderr": 0.037143259063020656 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.35294117647058826, "acc_stderr": 0.047551296160629454, "acc_norm": 0.35294117647058826, "acc_norm_stderr": 0.047551296160629454 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5404255319148936, "acc_stderr": 0.03257901482099835, "acc_norm": 0.5404255319148936, "acc_norm_stderr": 0.03257901482099835 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.45614035087719296, "acc_stderr": 0.04685473041907789, "acc_norm": 0.45614035087719296, "acc_norm_stderr": 0.04685473041907789 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.041227371113703316, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.041227371113703316 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3941798941798942, "acc_stderr": 0.02516798233389414, "acc_norm": 0.3941798941798942, "acc_norm_stderr": 0.02516798233389414 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3412698412698413, "acc_stderr": 0.04240799327574925, "acc_norm": 0.3412698412698413, "acc_norm_stderr": 0.04240799327574925 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7516129032258064, "acc_stderr": 0.024580028921481006, "acc_norm": 0.7516129032258064, "acc_norm_stderr": 0.024580028921481006 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009182, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009182 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7626262626262627, "acc_stderr": 0.0303137105381989, "acc_norm": 0.7626262626262627, "acc_norm_stderr": 0.0303137105381989 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8393782383419689, "acc_stderr": 0.026499057701397457, "acc_norm": 0.8393782383419689, "acc_norm_stderr": 0.026499057701397457 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.617948717948718, "acc_stderr": 0.024635549163908234, "acc_norm": 0.617948717948718, "acc_norm_stderr": 0.024635549163908234 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.028661201116524575, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.028661201116524575 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6218487394957983, "acc_stderr": 0.03149930577784906, "acc_norm": 0.6218487394957983, "acc_norm_stderr": 0.03149930577784906 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.038227469376587525, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.038227469376587525 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8073394495412844, "acc_stderr": 0.01690927688493608, "acc_norm": 0.8073394495412844, "acc_norm_stderr": 0.01690927688493608 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5324074074074074, "acc_stderr": 
0.03402801581358966, "acc_norm": 0.5324074074074074, "acc_norm_stderr": 0.03402801581358966 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7745098039215687, "acc_stderr": 0.02933116229425174, "acc_norm": 0.7745098039215687, "acc_norm_stderr": 0.02933116229425174 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7552742616033755, "acc_stderr": 0.027985699387036423, "acc_norm": 0.7552742616033755, "acc_norm_stderr": 0.027985699387036423 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6681614349775785, "acc_stderr": 0.031602951437766785, "acc_norm": 0.6681614349775785, "acc_norm_stderr": 0.031602951437766785 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7557251908396947, "acc_stderr": 0.03768335959728742, "acc_norm": 0.7557251908396947, "acc_norm_stderr": 0.03768335959728742 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7603305785123967, "acc_stderr": 0.03896878985070417, "acc_norm": 0.7603305785123967, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.03408997886857529, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.03408997886857529 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.39285714285714285, "acc_stderr": 0.04635550135609976, "acc_norm": 0.39285714285714285, "acc_norm_stderr": 0.04635550135609976 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8547008547008547, "acc_stderr": 0.023086635086841407, "acc_norm": 0.8547008547008547, "acc_norm_stderr": 0.023086635086841407 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 
0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7867177522349936, "acc_stderr": 0.014648172749593522, "acc_norm": 0.7867177522349936, "acc_norm_stderr": 0.014648172749593522 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6994219653179191, "acc_stderr": 0.024685316867257796, "acc_norm": 0.6994219653179191, "acc_norm_stderr": 0.024685316867257796 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2681564245810056, "acc_stderr": 0.014816119635317012, "acc_norm": 0.2681564245810056, "acc_norm_stderr": 0.014816119635317012 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.025646863097137897, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.025646863097137897 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.025494259350694902, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.025494259350694902 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6975308641975309, "acc_stderr": 0.02555765398186806, "acc_norm": 0.6975308641975309, "acc_norm_stderr": 0.02555765398186806 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4787234042553192, "acc_stderr": 0.029800481645628693, "acc_norm": 0.4787234042553192, "acc_norm_stderr": 0.029800481645628693 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.44784876140808344, "acc_stderr": 0.012700582404768224, "acc_norm": 0.44784876140808344, "acc_norm_stderr": 0.012700582404768224 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6397058823529411, "acc_stderr": 0.029163128570670733, "acc_norm": 0.6397058823529411, "acc_norm_stderr": 0.029163128570670733 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6617647058823529, "acc_stderr": 0.01913994374848704, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.01913994374848704 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, 
"acc_stderr": 0.04607582090719976, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.04607582090719976 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7183673469387755, "acc_stderr": 0.02879518557429129, "acc_norm": 0.7183673469387755, "acc_norm_stderr": 0.02879518557429129 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8059701492537313, "acc_stderr": 0.027962677604768914, "acc_norm": 0.8059701492537313, "acc_norm_stderr": 0.027962677604768914 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.035887028128263734, "acc_norm": 0.85, "acc_norm_stderr": 0.035887028128263734 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.03878626771002361, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.03878626771002361 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8187134502923976, "acc_stderr": 0.029547741687640038, "acc_norm": 0.8187134502923976, "acc_norm_stderr": 0.029547741687640038 }, "harness|truthfulqa:mc|0": { "mc1": 0.29498164014687883, "mc1_stderr": 0.01596440096558966, "mc2": 0.4531069456128054, "mc2_stderr": 0.01430354313553265 }, "harness|winogrande|5": { "acc": 0.7797947908445146, "acc_stderr": 0.011646276755089694 }, "harness|gsm8k|5": { "acc": 0.34874905231235787, "acc_stderr": 0.01312722705503586 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
tyzhu/random_letter_same_length_find_passage_train50_eval40_rare
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* dataset_info: features: - name: inputs dtype: string - name: targets dtype: string splits: - name: train num_bytes: 43576 num_examples: 140 - name: validation num_bytes: 15550 num_examples: 40 download_size: 39498 dataset_size: 59126 --- # Dataset Card for "random_letter_same_length_find_passage_train50_eval40_rare" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
CyberHarem/methode_sousounofrieren
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of Methode/メトーデ (Sousou no Frieren) This is the dataset of Methode/メトーデ (Sousou no Frieren), containing 70 images and their tags. The core tags of this character are `long_hair, brown_hair, breasts, blonde_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 70 | 39.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/methode_sousounofrieren/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 1200 | 70 | 39.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/methode_sousounofrieren/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 108 | 59.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/methode_sousounofrieren/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. 
If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/methode_sousounofrieren', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, upper_body, closed_mouth, corset, blurry_background, cape, expressionless, dress, from_side, indoors, looking_at_viewer, profile, purple_eyes, underbust | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, blue_eyes, closed_mouth, holding_staff, looking_at_viewer, solo, standing, black_shorts, corset, white_cape, garreg_mach_monastery_uniform, red_pantyhose, shirt, holding_polearm | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | 
upper_body | closed_mouth | corset | blurry_background | cape | expressionless | dress | from_side | indoors | looking_at_viewer | profile | purple_eyes | underbust | blue_eyes | holding_staff | standing | black_shorts | white_cape | garreg_mach_monastery_uniform | red_pantyhose | shirt | holding_polearm | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------------|:---------------|:---------|:--------------------|:-------|:-----------------|:--------|:------------|:----------|:--------------------|:----------|:--------------|:------------|:------------|:----------------|:-----------|:---------------|:-------------|:--------------------------------|:----------------|:--------|:------------------| | 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | X | | | | | | | X | | | | X | X | X | X | X | X | X | X | X |
ASSERT-KTH/megadiff-single-function
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: diff dtype: string - name: is_single_chunk dtype: bool - name: is_single_function dtype: bool - name: buggy_function dtype: string - name: fixed_function dtype: string splits: - name: train num_bytes: 1624059115.752317 num_examples: 72393 download_size: 546172221 dataset_size: 1624059115.752317 language: - code pretty_name: megadiff size_categories: - 10K<n<100K --- # Megadiff, a dataset of source code changes Contains only single-function diffs. If you use Megadiff, please cite the following technical report: "[Megadiff: A Dataset of 600k Java Source Code Changes Categorized by Diff Size](http://arxiv.org/pdf/2108.04631)". Technical Report 2108.04631, Arxiv; 2021. ``` @techreport{megadiff, TITLE = {{Megadiff: A Dataset of 600k Java Source Code Changes Categorized by Diff Size}}, AUTHOR = {Martin Monperrus and Matias Martinez and He Ye and Fernanda Madeiral and Thomas Durieux and Zhongxing Yu}, URL = {http://arxiv.org/pdf/2108.04631}, INSTITUTION = {Arxiv}, NUMBER = {2108.04631}, YEAR = {2021}, } ```
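Since each record pairs a `buggy_function` with its `fixed_function`, a unified diff can be recomputed directly from those two fields with Python's standard `difflib`. The sketch below uses a made-up Java pair in the style of the dataset's records, not an actual Megadiff entry:

```python
import difflib

# Hypothetical buggy/fixed pair mirroring the dataset's
# `buggy_function` / `fixed_function` fields (not a real record).
buggy_function = """public int add(int a, int b) {
    return a - b;
}"""
fixed_function = """public int add(int a, int b) {
    return a + b;
}"""

# Recompute a unified diff from the two function bodies.
diff = "\n".join(difflib.unified_diff(
    buggy_function.splitlines(),
    fixed_function.splitlines(),
    fromfile="buggy", tofile="fixed", lineterm="",
))
print(diff)
```

The same approach could be applied to real records after loading the dataset, e.g. to sanity-check that a record's stored `diff` field matches its function pair.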
aisuko/audio_record
--- license: apache-2.0 ---
diegorg151199/adv-ele
--- dataset_info: features: - name: ADV dtype: string - name: ELE dtype: string splits: - name: train num_bytes: 430918.56140350876 num_examples: 1732 - name: test num_bytes: 107978.43859649122 num_examples: 434 download_size: 294301 dataset_size: 538897.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
kunishou/ApolloCorpus-ja
--- license: apache-2.0 language: - ja --- ![Apollo](https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus/resolve/main/assets/apollo_medium_final.png) # ApolloCorpus-ja ## Overview This is a 525k instruction-tuning dataset created by machine-translating the multilingual medical dataset [ApolloCorpus](https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus) into Japanese. ApolloCorpus was collected by screening for open-source data whose quality could be assured. See the [paper](https://arxiv.org/abs/2403.03640) for details. ## Translated Files Because the data volume is large, only the following single file has been translated for now. Since translation quality drops for the non-English subsets, only the English dataset was machine-translated into Japanese (if additional files are translated in the future, they will likely also be limited to the English data files). - medicalPaper_en_qa.json (525k) ## Usage Notes This dataset was produced by machine-translating a multilingual dataset into Japanese, so it contains some translation errors. Use it with sufficient care when applying LLMs in the medical domain.
open-llm-leaderboard/details_chargoddard__MixtralRPChat-ZLoss
--- pretty_name: Evaluation run of chargoddard/MixtralRPChat-ZLoss dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [chargoddard/MixtralRPChat-ZLoss](https://huggingface.co/chargoddard/MixtralRPChat-ZLoss)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__MixtralRPChat-ZLoss\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-12-24T00:10:11.003805](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__MixtralRPChat-ZLoss/blob/main/results_2023-12-24T00-10-11.003805.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7014965083144905,\n\ \ \"acc_stderr\": 0.030525173302251206,\n \"acc_norm\": 0.7067661366946931,\n\ \ \"acc_norm_stderr\": 0.031115835600048672,\n \"mc1\": 0.386780905752754,\n\ \ \"mc1_stderr\": 0.01704885701051511,\n \"mc2\": 0.5385273808900092,\n\ \ \"mc2_stderr\": 0.015024918935321629\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6510238907849829,\n \"acc_stderr\": 0.013928933461382501,\n\ \ \"acc_norm\": 0.6860068259385665,\n \"acc_norm_stderr\": 0.013562691224726291\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6623182632941645,\n\ \ \"acc_stderr\": 0.004719529099913136,\n \"acc_norm\": 0.8609838677554272,\n\ \ \"acc_norm_stderr\": 0.0034525630964691227\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n\ \ \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.6814814814814815,\n\ \ \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n\ \ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\ \ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \ \ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7924528301886793,\n \"acc_stderr\": 0.024959918028911274,\n\ \ \"acc_norm\": 0.7924528301886793,\n \"acc_norm_stderr\": 0.024959918028911274\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n\ \ \"acc_stderr\": 0.030635578972093274,\n \"acc_norm\": 0.8402777777777778,\n\ \ \"acc_norm_stderr\": 
0.030635578972093274\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \ \ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\ \ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7109826589595376,\n\ \ \"acc_stderr\": 0.034564257450869995,\n \"acc_norm\": 0.7109826589595376,\n\ \ \"acc_norm_stderr\": 0.034564257450869995\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\ \ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n\ \ \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.6808510638297872,\n \"acc_stderr\": 0.030472973363380045,\n\ \ \"acc_norm\": 0.6808510638297872,\n \"acc_norm_stderr\": 0.030472973363380045\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n\ \ \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n\ \ \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n\ \ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.47619047619047616,\n \"acc_stderr\": 0.025722097064388535,\n \"\ acc_norm\": 
0.47619047619047616,\n \"acc_norm_stderr\": 0.025722097064388535\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n\ \ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n\ \ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8387096774193549,\n\ \ \"acc_stderr\": 0.020923327006423298,\n \"acc_norm\": 0.8387096774193549,\n\ \ \"acc_norm_stderr\": 0.020923327006423298\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.625615763546798,\n \"acc_stderr\": 0.03405155380561952,\n\ \ \"acc_norm\": 0.625615763546798,\n \"acc_norm_stderr\": 0.03405155380561952\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\ : 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.0315841532404771,\n\ \ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.0315841532404771\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\ acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078912,\n\ \ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078912\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.7025641025641025,\n \"acc_stderr\": 0.023177408131465942,\n\ \ \"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.023177408131465942\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \ \ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.8109243697478992,\n \"acc_stderr\": 0.025435119438105364,\n\ \ \"acc_norm\": 0.8109243697478992,\n \"acc_norm_stderr\": 0.025435119438105364\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.44370860927152317,\n \"acc_stderr\": 0.04056527902281732,\n \"\ acc_norm\": 0.44370860927152317,\n \"acc_norm_stderr\": 0.04056527902281732\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8788990825688073,\n \"acc_stderr\": 0.013987618292389713,\n \"\ acc_norm\": 0.8788990825688073,\n \"acc_norm_stderr\": 0.013987618292389713\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"\ acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801588,\n \"\ acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801588\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884562,\n \ \ \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884562\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7668161434977578,\n\ \ \"acc_stderr\": 0.028380391147094702,\n \"acc_norm\": 0.7668161434977578,\n\ \ \"acc_norm_stderr\": 0.028380391147094702\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.03278548537343138,\n\ \ \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.03278548537343138\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8677685950413223,\n \"acc_stderr\": 0.030922788320445784,\n \"\ acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.030922788320445784\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\ \ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\ \ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\ \ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\ \ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\ \ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\ \ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n\ \ \"acc_stderr\": 0.017893784904018533,\n \"acc_norm\": 0.9188034188034188,\n\ \ \"acc_norm_stderr\": 0.017893784904018533\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \ \ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8697318007662835,\n\ \ \"acc_stderr\": 0.012036729568216052,\n \"acc_norm\": 0.8697318007662835,\n\ \ \"acc_norm_stderr\": 0.012036729568216052\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7658959537572254,\n \"acc_stderr\": 0.022797110278071134,\n\ \ \"acc_norm\": 0.7658959537572254,\n \"acc_norm_stderr\": 0.022797110278071134\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4569832402234637,\n\ \ \"acc_stderr\": 0.01666049858050917,\n 
\"acc_norm\": 0.4569832402234637,\n\ \ \"acc_norm_stderr\": 0.01666049858050917\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7973856209150327,\n \"acc_stderr\": 0.02301544687798568,\n\ \ \"acc_norm\": 0.7973856209150327,\n \"acc_norm_stderr\": 0.02301544687798568\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n\ \ \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n\ \ \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.021185893615225174,\n\ \ \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.021185893615225174\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.549645390070922,\n \"acc_stderr\": 0.02968010556502904,\n \ \ \"acc_norm\": 0.549645390070922,\n \"acc_norm_stderr\": 0.02968010556502904\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5130378096479792,\n\ \ \"acc_stderr\": 0.012765893883835332,\n \"acc_norm\": 0.5130378096479792,\n\ \ \"acc_norm_stderr\": 0.012765893883835332\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.025767252010855952,\n\ \ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.025767252010855952\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.7516339869281046,\n \"acc_stderr\": 0.017479487001364764,\n \ \ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.017479487001364764\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\ \ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\ \ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.8,\n \"acc_stderr\": 0.025607375986579157,\n \ \ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.025607375986579157\n \ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\ \ \"acc_stderr\": 0.024112678240900808,\n \"acc_norm\": 0.8656716417910447,\n\ \ \"acc_norm_stderr\": 0.024112678240900808\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \ \ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\ \ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\ \ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n\ \ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.386780905752754,\n\ \ \"mc1_stderr\": 0.01704885701051511,\n \"mc2\": 0.5385273808900092,\n\ \ \"mc2_stderr\": 0.015024918935321629\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8200473559589582,\n \"acc_stderr\": 0.010796468688068682\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5056861258529188,\n \ \ \"acc_stderr\": 0.013771594106283033\n }\n}\n```" repo_url: https://huggingface.co/chargoddard/MixtralRPChat-ZLoss leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|arc:challenge|25_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-12-24T00-10-11.003805.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|gsm8k|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_12_24T00_10_11.003805 path: - 
'**/details_harness|hellaswag|10_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T00-10-11.003805.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-24T00-10-11.003805.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T00-10-11.003805.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T00-10-11.003805.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-24T00-10-11.003805.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-24T00-10-11.003805.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T00-10-11.003805.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-management|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-marketing|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-virology|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-24T00-10-11.003805.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|truthfulqa:mc|0_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-12-24T00-10-11.003805.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_12_24T00_10_11.003805 path: - '**/details_harness|winogrande|5_2023-12-24T00-10-11.003805.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-12-24T00-10-11.003805.parquet' - config_name: results data_files: - split: 
2023_12_24T00_10_11.003805 path: - results_2023-12-24T00-10-11.003805.parquet - split: latest path: - results_2023-12-24T00-10-11.003805.parquet --- # Dataset Card for Evaluation run of chargoddard/MixtralRPChat-ZLoss <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [chargoddard/MixtralRPChat-ZLoss](https://huggingface.co/chargoddard/MixtralRPChat-ZLoss) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_chargoddard__MixtralRPChat-ZLoss", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-24T00:10:11.003805](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__MixtralRPChat-ZLoss/blob/main/results_2023-12-24T00-10-11.003805.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7014965083144905, "acc_stderr": 0.030525173302251206, "acc_norm": 0.7067661366946931, "acc_norm_stderr": 0.031115835600048672, "mc1": 0.386780905752754, "mc1_stderr": 0.01704885701051511, "mc2": 0.5385273808900092, "mc2_stderr": 0.015024918935321629 }, "harness|arc:challenge|25": { "acc": 0.6510238907849829, "acc_stderr": 0.013928933461382501, "acc_norm": 0.6860068259385665, "acc_norm_stderr": 0.013562691224726291 }, "harness|hellaswag|10": { "acc": 0.6623182632941645, "acc_stderr": 0.004719529099913136, "acc_norm": 0.8609838677554272, "acc_norm_stderr": 0.0034525630964691227 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6814814814814815, "acc_stderr": 0.040247784019771096, "acc_norm": 0.6814814814814815, "acc_norm_stderr": 0.040247784019771096 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7828947368421053, "acc_stderr": 0.03355045304882924, "acc_norm": 0.7828947368421053, "acc_norm_stderr": 0.03355045304882924 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7924528301886793, "acc_stderr": 0.024959918028911274, "acc_norm": 0.7924528301886793, "acc_norm_stderr": 0.024959918028911274 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8402777777777778, "acc_stderr": 0.030635578972093274, "acc_norm": 0.8402777777777778, "acc_norm_stderr": 0.030635578972093274 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, 
"acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7109826589595376, "acc_stderr": 0.034564257450869995, "acc_norm": 0.7109826589595376, "acc_norm_stderr": 0.034564257450869995 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6808510638297872, "acc_stderr": 0.030472973363380045, "acc_norm": 0.6808510638297872, "acc_norm_stderr": 0.030472973363380045 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5964912280701754, "acc_stderr": 0.04615186962583707, "acc_norm": 0.5964912280701754, "acc_norm_stderr": 0.04615186962583707 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6413793103448275, "acc_stderr": 0.039966295748767186, "acc_norm": 0.6413793103448275, "acc_norm_stderr": 0.039966295748767186 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.47619047619047616, "acc_stderr": 0.025722097064388535, "acc_norm": 0.47619047619047616, "acc_norm_stderr": 0.025722097064388535 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5238095238095238, "acc_stderr": 0.04467062628403273, "acc_norm": 0.5238095238095238, "acc_norm_stderr": 0.04467062628403273 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8387096774193549, "acc_stderr": 0.020923327006423298, "acc_norm": 0.8387096774193549, "acc_norm_stderr": 0.020923327006423298 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.625615763546798, "acc_stderr": 0.03405155380561952, "acc_norm": 0.625615763546798, "acc_norm_stderr": 0.03405155380561952 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.793939393939394, "acc_stderr": 0.0315841532404771, "acc_norm": 0.793939393939394, "acc_norm_stderr": 0.0315841532404771 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8636363636363636, "acc_stderr": 0.024450155973189835, "acc_norm": 0.8636363636363636, "acc_norm_stderr": 0.024450155973189835 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9326424870466321, "acc_stderr": 0.018088393839078912, "acc_norm": 0.9326424870466321, "acc_norm_stderr": 0.018088393839078912 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7025641025641025, "acc_stderr": 0.023177408131465942, "acc_norm": 0.7025641025641025, "acc_norm_stderr": 0.023177408131465942 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.028897748741131143, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.028897748741131143 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8109243697478992, "acc_stderr": 0.025435119438105364, "acc_norm": 0.8109243697478992, "acc_norm_stderr": 0.025435119438105364 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.44370860927152317, "acc_stderr": 0.04056527902281732, "acc_norm": 0.44370860927152317, "acc_norm_stderr": 0.04056527902281732 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8788990825688073, "acc_stderr": 0.013987618292389713, "acc_norm": 0.8788990825688073, "acc_norm_stderr": 0.013987618292389713 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5740740740740741, "acc_stderr": 
0.03372343271653062, "acc_norm": 0.5740740740740741, "acc_norm_stderr": 0.03372343271653062 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8627450980392157, "acc_stderr": 0.024152225962801588, "acc_norm": 0.8627450980392157, "acc_norm_stderr": 0.024152225962801588 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8860759493670886, "acc_stderr": 0.020681745135884562, "acc_norm": 0.8860759493670886, "acc_norm_stderr": 0.020681745135884562 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7668161434977578, "acc_stderr": 0.028380391147094702, "acc_norm": 0.7668161434977578, "acc_norm_stderr": 0.028380391147094702 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8320610687022901, "acc_stderr": 0.03278548537343138, "acc_norm": 0.8320610687022901, "acc_norm_stderr": 0.03278548537343138 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8677685950413223, "acc_stderr": 0.030922788320445784, "acc_norm": 0.8677685950413223, "acc_norm_stderr": 0.030922788320445784 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8240740740740741, "acc_stderr": 0.036809181416738807, "acc_norm": 0.8240740740740741, "acc_norm_stderr": 0.036809181416738807 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7914110429447853, "acc_stderr": 0.031921934489347235, "acc_norm": 0.7914110429447853, "acc_norm_stderr": 0.031921934489347235 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.8446601941747572, "acc_stderr": 0.03586594738573974, "acc_norm": 0.8446601941747572, "acc_norm_stderr": 0.03586594738573974 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9188034188034188, "acc_stderr": 0.017893784904018533, "acc_norm": 0.9188034188034188, "acc_norm_stderr": 0.017893784904018533 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.78, "acc_stderr": 
0.041633319989322626, "acc_norm": 0.78, "acc_norm_stderr": 0.041633319989322626 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8697318007662835, "acc_stderr": 0.012036729568216052, "acc_norm": 0.8697318007662835, "acc_norm_stderr": 0.012036729568216052 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7658959537572254, "acc_stderr": 0.022797110278071134, "acc_norm": 0.7658959537572254, "acc_norm_stderr": 0.022797110278071134 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4569832402234637, "acc_stderr": 0.01666049858050917, "acc_norm": 0.4569832402234637, "acc_norm_stderr": 0.01666049858050917 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7973856209150327, "acc_stderr": 0.02301544687798568, "acc_norm": 0.7973856209150327, "acc_norm_stderr": 0.02301544687798568 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7845659163987139, "acc_stderr": 0.023350225475471442, "acc_norm": 0.7845659163987139, "acc_norm_stderr": 0.023350225475471442 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8240740740740741, "acc_stderr": 0.021185893615225174, "acc_norm": 0.8240740740740741, "acc_norm_stderr": 0.021185893615225174 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.549645390070922, "acc_stderr": 0.02968010556502904, "acc_norm": 0.549645390070922, "acc_norm_stderr": 0.02968010556502904 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5130378096479792, "acc_stderr": 0.012765893883835332, "acc_norm": 0.5130378096479792, "acc_norm_stderr": 0.012765893883835332 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7647058823529411, "acc_stderr": 0.025767252010855952, "acc_norm": 0.7647058823529411, "acc_norm_stderr": 0.025767252010855952 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7516339869281046, "acc_stderr": 0.017479487001364764, "acc_norm": 0.7516339869281046, "acc_norm_stderr": 0.017479487001364764 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 
0.044612721759105085, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.044612721759105085 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8, "acc_stderr": 0.025607375986579157, "acc_norm": 0.8, "acc_norm_stderr": 0.025607375986579157 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8656716417910447, "acc_stderr": 0.024112678240900808, "acc_norm": 0.8656716417910447, "acc_norm_stderr": 0.024112678240900808 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.536144578313253, "acc_stderr": 0.038823108508905954, "acc_norm": 0.536144578313253, "acc_norm_stderr": 0.038823108508905954 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8596491228070176, "acc_stderr": 0.0266405825391332, "acc_norm": 0.8596491228070176, "acc_norm_stderr": 0.0266405825391332 }, "harness|truthfulqa:mc|0": { "mc1": 0.386780905752754, "mc1_stderr": 0.01704885701051511, "mc2": 0.5385273808900092, "mc2_stderr": 0.015024918935321629 }, "harness|winogrande|5": { "acc": 0.8200473559589582, "acc_stderr": 0.010796468688068682 }, "harness|gsm8k|5": { "acc": 0.5056861258529188, "acc_stderr": 0.013771594106283033 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
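Looping back to the "Latest results" block of this card: the top-level `acc` aggregates the per-task accuracies. A rough sketch of such an aggregation is below — it takes an unweighted mean over a small subset of the per-task scores reported above, purely for illustration; the leaderboard's actual aggregation may differ.

```python
# Subset of per-task accuracies copied from the results JSON above.
task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.38,
    "harness|hendrycksTest-anatomy|5": 0.6814814814814815,
    "harness|hendrycksTest-astronomy|5": 0.7828947368421053,
}

# Unweighted mean over the selected tasks (illustration only).
mean_acc = sum(task_acc.values()) / len(task_acc)
print(f"mean acc over {len(task_acc)} tasks: {mean_acc:.4f}")
```

The same pattern extends to all 57 MMLU subtasks once the full results file is loaded.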
PleIAs/French-PD-Newspapers
--- task_categories: - text-generation language: - fr tags: - ocr pretty_name: French-Public Domain-Newspapers --- # 🇫🇷 French Public Domain Newspapers 🇫🇷 **French-Public Domain-Newspapers** or **French-PD-Newspapers** is a large collection aiming to aggregate all the French newspapers and periodicals in the public domain. The collection was originally compiled by Pierre-Carl Langlais, on the basis of a large corpus curated by Benoît de Courson and Benjamin Azoulay for [Gallicagram](https://shiny.ens-paris-saclay.fr/app/gallicagram) and in cooperation with OpenLLMFrance. Gallicagram is a leading cultural analytics project giving access to word and ngram search on very large cultural heritage datasets in French and other languages. ## Content As of January 2024, the collection contains nearly three million unique newspaper and periodical editions (69,763,525,347 words) from the French National Library (Gallica). Each parquet file holds the full text of a few thousand editions selected at random and, when available, a few core metadata fields (Gallica id, title, author, word counts…). The metadata can be easily expanded thanks to the BNF API. This initial aggregation was made possible thanks to the open data program of the French National Library and the consolidation of public domain status for cultural heritage works in the EU with the 2019 Copyright Directive (art. 14). The composition of the dataset adheres to the French criteria for the public domain of collective works (any publication older than 70 years) and individual works (any publication whose author has been dead for more than 70 years). In agreement with the shorter-term rules, the dataset is in the public domain everywhere. ## Uses The primary use of the collection is for cultural analytics projects on a wide scale. The collection also aims to expand the availability of open works for the training of Large Language Models. The text can be used for model training and republished without restriction for reproducibility purposes.
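Since each parquet file pairs full text with light metadata, a typical cultural-analytics pass simply iterates over rows and aggregates counts. A minimal, self-contained sketch is below; the row dicts and field names are illustrative assumptions, not the dataset's actual schema.

```python
# Illustrative stand-in rows mimicking the kind of fields described above
# (Gallica id, title, full text). Field names and values are assumptions.
rows = [
    {"gallica_id": "bpt6k0001", "title": "Le Petit Journal",
     "text": "la République est proclamée"},
    {"gallica_id": "bpt6k0002", "title": "Le Figaro",
     "text": "les débats de la chambre des députés"},
]

# Aggregate a simple corpus statistic: whitespace-delimited word count.
total_words = sum(len(row["text"].split()) for row in rows)
print(f"{len(rows)} editions, {total_words} words")
```

On the real collection, the same loop would run over the parquet files (for instance via `datasets` in streaming mode) rather than an in-memory list.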
## License The entire collection is in the public domain everywhere. This means that the patrimonial rights of the individual and collective rightholders have expired. The French National Library claims additional rights in its terms of use and restricts commercial use: "La réutilisation commerciale de ces contenus est payante et fait l'objet d'une licence. Est entendue par réutilisation commerciale la revente de contenus sous forme de produits élaborés ou de fourniture de service ou toute autre réutilisation des contenus générant directement des revenus." ("Commercial re-use of these contents is subject to a fee and requires a licence. Commercial re-use is understood as the resale of contents in the form of finished products, the provision of services, or any other re-use of the contents that directly generates revenue.") There has been a debate for years in Europe over the definition of the public domain and the possibility of restricting its use. Since 2019, the EU Copyright Directive states that "Member States shall provide that, when the term of protection of a work of visual art has expired, any material resulting from an act of reproduction of that work is not subject to copyright or related rights, unless the material resulting from that act of reproduction is original in the sense that it is the author's own intellectual creation." (art. 14) ## Future developments This dataset is not a one-time effort but will continue to evolve significantly in two directions: * Correction of computer-generated errors in the text. All the texts have been transcribed automatically through the use of Optical Character Recognition (OCR) software. The original files have been digitized over a long time period (since the mid-2000s), and the OCR quality of some documents is consequently lower. Future versions will strive either to re-OCRize the original text or to use experimental LLM models for partial OCR correction. * Enhancement of the structure/editorial presentation of the original text. Some parts of the original documents are likely unwanted for large-scale analysis or model training (headers, page counts…). Additionally, some advanced document structures like tables or multi-column layouts are unlikely to be well formatted.
Major enhancements can be expected from applying new SOTA layout recognition models (like COLAF) to the original PDF files. * Expansion of the collection to other cultural heritage holdings, especially coming from Hathi Trust, Internet Archive and Google Books. ## Acknowledgements The corpus was stored and processed with the generous support of Scaleway. It was built up with the support and concerted efforts of the state start-up LANGU:IA (start-up d'Etat), supported by the French Ministry of Culture and DINUM, as part of the prefiguration of the service offering of the Alliance for Language technologies EDIC (ALT-EDIC). Corpus collection has been greatly facilitated by the insights and cooperation of the open-science LLM community (Occiglot, Eleuther AI, Allen AI). <div style="text-align: center;"> <img src="https://github.com/mch-dd/datasetlogo/blob/main/scaleway.jpeg?raw=true" style="width: 33%; margin: 0 auto; display: inline-block;"/> <img src="https://github.com/mch-dd/datasetlogo/blob/main/ministere.png?raw=true" style="width: 33%; margin: 0 auto; display: inline-block;"/> <img src="https://github.com/mch-dd/datasetlogo/blob/main/occiglot.jpg?raw=true" style="width: 33%; margin: 0 auto; display: inline-block;"/> </div>
causalnlp/CLadder
--- configs: - config_name: default data_files: - split: full_v1.5_default path: "data/full_v1.5_default.csv" - split: full_v1 path: "data/full_v1.csv" ---
svyas23/GAMa
--- license: other --- GAMa (Ground-video to Aerial-image Matching) dataset. Download at: https://www.crcv.ucf.edu/data1/GAMa/ # GAMa: Cross-view Video Geo-localization by [Shruti Vyas](https://scholar.google.com/citations?user=15YqUQUAAAAJ&hl=en); [Chen Chen](https://scholar.google.com/citations?user=TuEwcZ0AAAAJ&hl=en); [Mubarak Shah](https://scholar.google.com/citations?user=p8gsO3gAAAAJ&hl=en) Code at: https://github.com/svyas23/GAMa/blob/main/README.md
tyzhu/find_second_sent_train_50_eval_10_hint5
--- dataset_info: features: - name: inputs dtype: string - name: targets dtype: string - name: title dtype: string - name: context dtype: string splits: - name: train num_bytes: 135743 num_examples: 110 - name: validation num_bytes: 9461 num_examples: 10 download_size: 82208 dataset_size: 145204 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* --- # Dataset Card for "find_second_sent_train_50_eval_10_hint5" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
DevilCaos/lucas
--- license: unknown ---
declare-lab/GSM8k_MORE
--- license: apache-2.0 task_categories: - text2text-generation - question-answering - text-generation language: - en size_categories: - n<1K --- Dataset introduced in the paper _Stuck in the Quicksand of Numeracy, Far from AGI Summit: Evaluating LLMs' Mathematical Competency through Ontology-guided Perturbations_. This dataset was created by randomly sampling five questions from GSM8K and perturbing them using an ontology. <img src="https://raw.githubusercontent.com/declare-lab/llm_robustness/9a358fc0a331b63ffa3047fb3907dd92abd85b0a/assets/ontology_uni.png" alt="Image" width="800" height="800"> # Performance of LLMs on MORE | Domain | Original | Logic Alteration | | | | Avg. | Concept Analysis | | | Avg. | Format Change | | Avg. | Form. Constraint | Weighted Avg. | |--------|----------|------------------|---|---|---|------|------------------|---|---|------|---------------|---|------|------------------|---------------| | Dimension | | Quest. Simpl. | Reason Adjust. | Compute. Adjust. | Symbol Manip. | Perf. | Quest. Under. | Sol. Eval. | Error Debug | Perf. | Alt. Format | Pair. Comp. | Perf. | Answer Constraint | | | **GPT-4** | 100 | 100 | 80 | 90.91 | 60 | 78.30 | 85 | 65 | 48 | 64.62 | 90 | 60 | 84.00 | 65 | 74.21 | | **GPT-3.5** | 80 | 75 | 27.5 | 54.55 | 25.71 | 38.68 | 55 | 45 | 12 | 35.38 | 35 | 40 | 36.00 | 5 | 35.75 | | **Gemini** | 80 | 90 | 50 | 81.82 | 37.14 | 56.60 | 60 | 20 | 16 | 30.77 | 55 | 20 | 48.00 | 30 | 46.15 | | **Llama2-Chat** | 60 | 50 | 12.5 | 18.18 | 5.71 | 17.92 | 35 | 60 | 4 | 30.77 | 5 | 60 | 16.00 | 5 | 26.24 | | **Metamath** | 80 | 70 | 15 | 27.27 | 11.43 | 25.47 | 30 | 25 | 4 | 18.46 | 35 | 80 | 44.00 | 20 | 21.27 | | **Average** | 80 | 77 | 37 | 54.55 | 27.90 | 43.39 | 53 | 43 | 16.8 | 36.00 | 44 | 52 | 45.60 | 25 | 40.72 |
kevinlan888/test_data
--- task_categories: - question-answering language: - zh size_categories: - n<1K ---
theofcks/Matue
--- license: openrail ---
Drozdik/tattoo_v1
--- dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 101532798.169 num_examples: 4239 download_size: 78733652 dataset_size: 101532798.169 --- # Dataset Card for "tattoo_v1" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
hemachandher/sql-license
--- license: apache-2.0 ---
arabic_speech_corpus
--- pretty_name: Arabic Speech Corpus annotations_creators: - expert-generated language_creators: - crowdsourced language: - ar license: - cc-by-4.0 multilinguality: - monolingual paperswithcode_id: arabic-speech-corpus size_categories: - 1K<n<10K source_datasets: - original task_categories: - automatic-speech-recognition task_ids: [] train-eval-index: - config: clean task: automatic-speech-recognition task_id: speech_recognition splits: train_split: train eval_split: test col_mapping: file: path text: text metrics: - type: wer name: WER - type: cer name: CER dataset_info: features: - name: file dtype: string - name: text dtype: string - name: audio dtype: audio: sampling_rate: 48000 - name: phonetic dtype: string - name: orthographic dtype: string config_name: clean splits: - name: train num_bytes: 1002365 num_examples: 1813 - name: test num_bytes: 65784 num_examples: 100 download_size: 1192302846 dataset_size: 1068149 --- # Dataset Card for Arabic Speech Corpus ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## 
Dataset Description - **Homepage:** [Arabic Speech Corpus](http://en.arabicspeechcorpus.com/) - **Repository:** [Needs More Information] - **Paper:** [Modern standard Arabic phonetics for speech synthesis](http://en.arabicspeechcorpus.com/Nawar%20Halabi%20PhD%20Thesis%20Revised.pdf) - **Leaderboard:** [Needs More Information] - **Point of Contact:** [Nawar Halabi](mailto:nawar.halabi@gmail.com) ### Dataset Summary This speech corpus was developed as part of PhD work carried out by Nawar Halabi at the University of Southampton. The corpus was recorded in south Levantine Arabic (Damascene accent) in a professional studio. Speech synthesized using this corpus produces a high-quality, natural voice. ### Supported Tasks and Leaderboards [Needs More Information] ### Languages The audio is in Arabic. ## Dataset Structure ### Data Instances A typical data point comprises the path to the audio file, usually called `file`, and its transcription, called `text`. 
An example from the dataset is: ``` { 'file': '/Users/username/.cache/huggingface/datasets/downloads/extracted/baebe85e2cb67579f6f88e7117a87888c1ace390f4f14cb6c3e585c517ad9db0/arabic-speech-corpus/wav/ARA NORM 0002.wav', 'audio': {'path': '/Users/username/.cache/huggingface/datasets/downloads/extracted/baebe85e2cb67579f6f88e7117a87888c1ace390f4f14cb6c3e585c517ad9db0/arabic-speech-corpus/wav/ARA NORM 0002.wav', 'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346, 0.00091553, 0.00085449], dtype=float32), 'sampling_rate': 48000}, 'orthographic': 'waraj~aHa Alt~aqoriyru Al~a*iy >aEad~ahu maEohadu >aboHaA^i haDabapi Alt~ibiti fiy Alo>akaAdiymiy~api AlS~iyniy~api liloEuluwmi - >ano tasotamir~a darajaAtu AloHaraArapi wamusotawayaAtu Alr~uTuwbapi fiy Alo<irotifaAEi TawaAla ha*aA Aloqarono', 'phonetic': "sil w a r a' jj A H a tt A q r ii0' r u0 ll a * i0 < a E a' dd a h u0 m a' E h a d u0 < a b H aa' ^ i0 h A D A' b a t i0 tt i1' b t i0 f i0 l < a k aa d ii0 m ii0' y a t i0 SS II0 n ii0' y a t i0 l u0 l E u0 l uu0' m i0 sil < a' n t a s t a m i0' rr a d a r a j aa' t u0 l H a r aa' r a t i0 w a m u0 s t a w a y aa' t u0 rr U0 T UU0' b a t i0 f i0 l Ah i0 r t i0 f aa' E i0 T A' w A l a h aa' * a l q A' r n sil", 'text': '\ufeffwaraj~aHa Alt~aqoriyru Al~aTHiy >aEad~ahu maEohadu >aboHaA^i haDabapi Alt~ibiti fiy Alo>akaAdiymiy~api AlS~iyniy~api liloEuluwmi - >ano tasotamir~a darajaAtu AloHaraArapi wamusotawayaAtu Alr~uTuwbapi fiy Alo<irotifaAEi TawaAla haTHaA Aloqarono' } ``` ### Data Fields - file: A path to the downloaded audio file in .wav format. - audio: A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. 
Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`. - text: the transcription of the audio file. - phonetic: the transcription in phonetic format. - orthographic: the transcription in orthographic format. ### Data Splits | | Train | Test | | ----- | ----- | ---- | | dataset | 1813 | 100 | ## Dataset Creation ### Curation Rationale The corpus was created with speech synthesis as the main application in mind, although it has also been used as part of a larger corpus for speech recognition and speech denoising. Here are some explanations of why the corpus was built the way it is: * Corpus size: Budget limitations and the research goal resulted in the decision not to gather more data. The goal was to show that high-quality speech synthesis is possible with smaller corpora. * Phonetic diversity: As with many corpora, phonetic diversity was achieved using greedy methods: start with a core set of utterances and iteratively add the utterances that contribute most to phonetic diversity. The measure of diversity is based on the diphone frequency. * Content: News, sports, economics, and fully diacritised content from the internet was gathered. The choice of utterances was random to avoid copyright issues. Because of the corpus size, achieving diversity of content type was difficult and was not the goal. * Non-sense utterances: The corpus contains a large set of utterances that are generated computationally to compensate for the diphones missing in the main part of the corpus. The usefulness of non-sense utterances was not proven in the PhD thesis. * The talent: The voice talent had a Syrian dialect from Damascus and spoke in formal Arabic. Please refer to the [PhD thesis](#Citation-Information) for more detailed information. 
### Source Data #### Initial Data Collection and Normalization News, sports, economics, and fully diacritised content from the internet was gathered. The choice of utterances was random to avoid copyright issues. Because of the corpus size, achieving diversity of content type was difficult and was not the goal. We were restricted to content which was fully diacritised, to make the annotation process easier. As with many corpora, phonetic diversity was achieved using greedy methods: start with a core set of utterances and iteratively add the utterances that contribute most to phonetic diversity. The measure of diversity is based on the diphone frequency. Please refer to the [PhD thesis](#Citation-Information). #### Who are the source language producers? Please refer to the [PhD thesis](#Citation-Information). ### Annotations #### Annotation process Three annotators aligned audio with phonemes with the help of HTK forced alignment. They also worked on overlapping parts, to assess annotator agreement and the quality of the annotations. The entire corpus was checked by human annotators. Please refer to the [PhD thesis](#Citation-Information). #### Who are the annotators? Nawar Halabi and two anonymous Arabic language teachers. ### Personal and Sensitive Information The dataset consists of people who have donated their voice online. You agree not to attempt to determine the identity of speakers in this dataset. The voice talent agreed in writing for their voice to be used in speech technologies as long as they stay anonymous. ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [Needs More Information] ## Additional Information ### Dataset Curators The corpus was recorded by Nawar Halabi in a professional studio, in south Levantine Arabic (Damascene accent). 
### Licensing Information [CC BY 4.0](https://creativecommons.org/licenses/by/4.0/) ### Citation Information ``` @phdthesis{halabi2016modern, title={Modern standard Arabic phonetics for speech synthesis}, author={Halabi, Nawar}, year={2016}, school={University of Southampton} } ``` ### Contributions This dataset was created by: * Nawar Halabi [@nawarhalabi](https://github.com/nawarhalabi), main creator and annotator. * Two anonymous Arabic language teachers as annotators. * One anonymous voice talent. * Thanks to [@zaidalyafeai](https://github.com/zaidalyafeai) for adding this dataset.
laion/laion-pop
mehdie/sefaria
--- license: cc-by-4.0 language: - he - en tags: - History - Rabbinic pretty_name: Sefaria HF Dataset --- This dataset is a Hugging Face interface to the [Sefaria database export](https://github.com/Sefaria/Sefaria-Export). Sefaria is a large collection of early Jewish texts, mostly in ancient Hebrew; some are in Aramaic, and some are English translations.
williamlee/test2
--- license: apache-2.0 ---
KevinZ/psycholinguistic_eval
--- annotations_creators: - expert-generated language_creators: - expert-generated language: - en-US license: - mit multilinguality: - monolingual pretty_name: psycholinguistic_eval size_categories: - n<1K source_datasets: [] task_categories: - multiple-choice - fill-mask - question-answering - zero-shot-classification task_ids: [] --- This is a suite of psycholinguistic datasets by Allyson Ettinger. See her [official Github repository](https://github.com/aetting/lm-diagnostics) for specific details.
Justiceak/yellow-book
--- license: mit ---
yoshitomo-matsubara/mu-mimo
--- license: cdla-permissive-2.0 pretty_name: mu_mimo size_categories: - 100K<n<1M --- # MU-MIMO datasets This is the official repository of MU-MIMO datasets used in "SplitBeam: Effective and Efficient Beamforming in Wi-Fi Networks Through Split Computing" (ICDCS 2023). `*-h_mat.npy` and `*-v_mat.npy` are input samples and targets, respectively. If you have any questions about the datasets, please directly contact [`Niloofar Bahadori`](https://niloobahadori.github.io/) as she built both the real and synthetic datasets. The code is available [here](https://github.com/yoshitomo-matsubara/split-beam). ## Citation ```bibtex @inproceedings{bahadori2023splitbeam, title={{SplitBeam: Effective and Efficient Beamforming in Wi-Fi Networks Through Split Computing}}, author={Bahadori, Niloofar and Matsubara, Yoshitomo and Levorato, Marco and Restuccia, Francesco}, booktitle={2023 IEEE 43rd International Conference on Distributed Computing Systems (ICDCS)}, pages={864--874}, year={2023}, organization={IEEE} } ```
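Since the card above only states the file-naming convention, here is a hedged sketch of how one might load and pair the inputs and targets. The `prefix`, the demo file names, and the array shapes are hypothetical; only NumPy is assumed.

```python
import numpy as np

def load_pair(prefix):
    """Load one input/target pair following the card's naming scheme:
    `<prefix>-h_mat.npy` holds input samples and `<prefix>-v_mat.npy`
    the corresponding targets."""
    h = np.load(f"{prefix}-h_mat.npy")  # inputs (channel matrices)
    v = np.load(f"{prefix}-v_mat.npy")  # targets (beamforming matrices)
    if len(h) != len(v):
        raise ValueError("inputs and targets must align sample-wise")
    return h, v

# Self-contained demo with fabricated arrays (shapes are made up):
np.save("demo-h_mat.npy", np.zeros((4, 3, 2), dtype=np.complex64))
np.save("demo-v_mat.npy", np.ones((4, 2, 1), dtype=np.complex64))
h, v = load_pair("demo")
print(h.shape, v.shape)  # (4, 3, 2) (4, 2, 1)
```

The actual dataset files may use different shapes and dtypes; consult the SplitBeam repository linked above for the authoritative loading code.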