Schema of the listing (column types and value statistics as shown in the dataset viewer):

| Column | Type | Min | Max |
|---|---|---|---|
| id | string (length) | 2 | 115 |
| lastModified | string (length) | 24 | 24 |
| tags | list | | |
| author | string (length) | 2 | 42 |
| description | string (length) | 0 | 68.7k |
| citation | string (length) | 0 | 10.7k |
| cardData | null | | |
| likes | int64 | 0 | 3.55k |
| downloads | int64 | 0 | 10.1M |
| card | string (length) | 0 | 1.01M |
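Given this schema, a minimal sketch of loading such a listing and filtering out entries with no card; the repo id below is hypothetical and stands in for the actual listing dataset:

```python
from datasets import load_dataset

# Hypothetical repo id; substitute the actual listing dataset.
listing = load_dataset("user/dataset-card-listing", split="train")

# Drop rows whose card could not be fetched ("Entry not found").
with_cards = listing.filter(lambda row: row["card"] != "Entry not found")
print(f"{len(with_cards)} of {len(listing)} entries have a dataset card")
```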
### wilayna/support

- **lastModified:** 2023-10-11T00:08:17.000Z
- **tags:** ["region:us"]
- **author:** wilayna
- **description:** null
- **citation:** null
- **cardData:** null
- **likes:** 0
- **downloads:** 0
- **card:** Entry not found
### W1lson/Book3

- **lastModified:** 2023-10-11T00:09:22.000Z
- **tags:** ["region:us"]
- **author:** W1lson
- **description:** null
- **citation:** null
- **cardData:** null
- **likes:** 0
- **downloads:** 0
- **card:**

```yaml
dataset_info:
  features:
  - name: Requirement ID
    dtype: string
  - name: ' Requirement Description'
    dtype: string
  splits:
  - name: train
    num_bytes: 6975
    num_examples: 100
  download_size: 4920
  dataset_size: 6975
```

# Dataset Card for "Book3"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### kimsiun/clinical_trial_eligibility_crietria_recommendation

- **lastModified:** 2023-10-11T01:20:38.000Z
- **tags:** ["license:mit", "region:us"]
- **author:** kimsiun
- **description:** null
- **citation:** null
- **cardData:** null
- **likes:** 0
- **downloads:** 0
- **card:**

```yaml
license: mit
```

This is a public repository of the data used in the paper "CReSE: Enhancing Clinical Trial Design via Contrastive Learning and Rephrasing-based and Clinical Relevance-preserving Sentence Embedding" (under review). It stores three main types of data:

1. **Positive-negative EC-title pairs.** A dataset that pairs the eligibility criteria (ECs) used in a study with the study's title and other design information. It can be used to train EC recommendation models (binary classification). Several variants are provided, differing in the input type of the trial information and in the number of ECs per trial. For example, the file "train_pairs_positive_inputtype_only_title.p" holds positive pairs collected using only the trial title as input, while "train_pairs_negative_Ent8_inputtype_title+CTinfo.p" holds negative pairs collected using the trial title plus semi-structured key design factors, restricted to trials with 8 or more ECs reported through clinicaltrials.gov.
2. **Original-rephrased EC pairs.** The pairs used to develop the CReSE model; EC rephrasing was performed with ChatGPT (gpt-3.5-turbo).
3. **Clinical relevance data between EC pairs.** A dataset rating the clinical relevance between different ECs, created to evaluate the EC clustering performance of the CReSE model. It was likewise generated with ChatGPT (gpt-3.5-turbo).

Please refer to the paper for the specific data generation conditions and related prompts. A loading sketch for these files follows the record.
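The `.p` extension on the pair files suggests Python pickles; a minimal loading sketch under that assumption (the filename is one of those listed in the card above):

```python
import pickle

from huggingface_hub import hf_hub_download

# Fetch one of the positive-pair files named in the card.
path = hf_hub_download(
    repo_id="kimsiun/clinical_trial_eligibility_crietria_recommendation",
    filename="train_pairs_positive_inputtype_only_title.p",
    repo_type="dataset",
)

# Assumption: .p files are standard Python pickles.
with open(path, "rb") as f:
    pairs = pickle.load(f)

print(type(pairs))
```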
### natnitaract/kaggel-llm-science-exam-2023-RAG

- **lastModified:** 2023-10-11T00:51:23.000Z
- **tags:** ["license:apache-2.0", "region:us"]
- **author:** natnitaract
- **description:** null
- **citation:** null
- **cardData:** null
- **likes:** 0
- **downloads:** 0
- **card:**

```yaml
license: apache-2.0
```
### BubbleJoe/scitail_unified_input

- **lastModified:** 2023-10-11T00:52:23.000Z
- **tags:** ["region:us"]
- **author:** BubbleJoe
- **description:** null
- **citation:** null
- **cardData:** null
- **likes:** 0
- **downloads:** 0
- **card:**

```yaml
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
  - split: validation
    path: data/validation-*
dataset_info:
  features:
  - name: sentence1_binary_parse
    dtype: string
  - name: sentence1_parse
    dtype: string
  - name: sentence1
    dtype: string
  - name: sentence2_parse
    dtype: string
  - name: sentence2
    dtype: string
  - name: annotator_labels
    sequence: string
  - name: gold_label
    dtype: string
  - name: input
    dtype: string
  - name: label
    dtype: int64
  splits:
  - name: train
    num_bytes: 27422381
    num_examples: 23596
  - name: test
    num_bytes: 2447299
    num_examples: 2126
  - name: validation
    num_bytes: 1544360
    num_examples: 1304
  download_size: 9513186
  dataset_size: 31414040
```

# Dataset Card for "scitail_unified_input"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
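Since the YAML above declares train/test/validation splits and unified `input`/`label` columns, a minimal loading sketch:

```python
from datasets import load_dataset

# Split names and column names come from the card's YAML above.
ds = load_dataset("BubbleJoe/scitail_unified_input", split="validation")

example = ds[0]
print(example["input"], example["label"])
```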
### Agtian/abv

- **lastModified:** 2023-10-11T01:10:06.000Z
- **tags:** ["region:us"]
- **author:** Agtian
- **description:** null
- **citation:** null
- **cardData:** null
- **likes:** 0
- **downloads:** 0
- **card:** Entry not found
### walsenjond/ansawn

- **lastModified:** 2023-10-11T01:02:38.000Z
- **tags:** ["region:us"]
- **author:** walsenjond
- **description:** null
- **citation:** null
- **cardData:** null
- **likes:** 0
- **downloads:** 0
- **card:** Entry not found
### milkshake721/17k-scienceQA

- **lastModified:** 2023-10-11T01:06:01.000Z
- **tags:** ["license:apache-2.0", "region:us"]
- **author:** milkshake721
- **description:** null
- **citation:** null
- **cardData:** null
- **likes:** 0
- **downloads:** 0
- **card:**

```yaml
license: apache-2.0
```
### open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v3

- **lastModified:** 2023-10-11T01:17:57.000Z
- **tags:** ["region:us"]
- **author:** open-llm-leaderboard
- **description:** null
- **citation:** null
- **cardData:** null
- **likes:** 0
- **downloads:** 0
- **card:**
--- pretty_name: Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-7B-v3 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Severian/ANIMA-Phi-Neptune-Mistral-7B-v3](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v3)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v3\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-11T01:16:32.937269](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v3/blob/main/results_2023-10-11T01-16-32.937269.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5392786633208031,\n\ \ \"acc_stderr\": 0.03494779312823446,\n \"acc_norm\": 0.5431264898387953,\n\ \ \"acc_norm_stderr\": 0.03493217150757376,\n \"mc1\": 0.412484700122399,\n\ \ \"mc1_stderr\": 0.01723329939957122,\n \"mc2\": 0.5940288949043588,\n\ \ \"mc2_stderr\": 0.015208554054531144\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5401023890784983,\n \"acc_stderr\": 0.01456431885692485,\n\ \ \"acc_norm\": 0.568259385665529,\n \"acc_norm_stderr\": 0.014474591427196202\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5893248356901015,\n\ \ \"acc_stderr\": 0.004909509538525159,\n \"acc_norm\": 0.7881896036646087,\n\ \ \"acc_norm_stderr\": 0.004077561349272391\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\ \ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\ \ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n\ \ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\ \ \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \ \ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.0295822451283843,\n\ \ \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.0295822451283843\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n\ \ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.5347222222222222,\n\ \ 
\"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \ \ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\ \ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\ \ \"acc_stderr\": 0.037724468575180276,\n \"acc_norm\": 0.5722543352601156,\n\ \ \"acc_norm_stderr\": 0.037724468575180276\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077615,\n\ \ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077615\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n\ \ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\ \ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\ \ \"acc_stderr\": 0.04579639422070435,\n \"acc_norm\": 0.38596491228070173,\n\ \ \"acc_norm_stderr\": 0.04579639422070435\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\ \ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596433,\n \"\ acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596433\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\ \ \"acc_stderr\": 0.0436031486007746,\n \"acc_norm\": 0.3888888888888889,\n\ \ \"acc_norm_stderr\": 0.0436031486007746\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6161290322580645,\n\ \ \"acc_stderr\": 0.02766618207553965,\n \"acc_norm\": 0.6161290322580645,\n\ \ \"acc_norm_stderr\": 0.02766618207553965\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561952,\n\ \ \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561952\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\ : 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\ \ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.6565656565656566,\n \"acc_stderr\": 0.03383201223244441,\n \"\ acc_norm\": 0.6565656565656566,\n \"acc_norm_stderr\": 0.03383201223244441\n\ \ },\n 
\"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7357512953367875,\n \"acc_stderr\": 0.03182155050916645,\n\ \ \"acc_norm\": 0.7357512953367875,\n \"acc_norm_stderr\": 0.03182155050916645\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.47692307692307695,\n \"acc_stderr\": 0.025323990861736118,\n\ \ \"acc_norm\": 0.47692307692307695,\n \"acc_norm_stderr\": 0.025323990861736118\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871937,\n \ \ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871937\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.4957983193277311,\n \"acc_stderr\": 0.0324773433444811,\n \ \ \"acc_norm\": 0.4957983193277311,\n \"acc_norm_stderr\": 0.0324773433444811\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\ acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7229357798165138,\n \"acc_stderr\": 0.01918848259016953,\n \"\ acc_norm\": 0.7229357798165138,\n \"acc_norm_stderr\": 0.01918848259016953\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\ acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.6519607843137255,\n \"acc_stderr\": 0.03343311240488418,\n \"\ acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.03343311240488418\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7088607594936709,\n \"acc_stderr\": 0.02957160106575337,\n \ \ \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.02957160106575337\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6098654708520179,\n\ \ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.6098654708520179,\n\ \ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.0426073515764456,\n\ \ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.0426073515764456\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\ : 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6111111111111112,\n\ \ \"acc_stderr\": 0.04712821257426769,\n \"acc_norm\": 0.6111111111111112,\n\ \ \"acc_norm_stderr\": 0.04712821257426769\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6196319018404908,\n \"acc_stderr\": 0.038142698932618374,\n\ \ \"acc_norm\": 0.6196319018404908,\n \"acc_norm_stderr\": 0.038142698932618374\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\ \ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\ \ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326468,\n\ \ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326468\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\ \ \"acc_stderr\": 
0.024662496845209818,\n \"acc_norm\": 0.8290598290598291,\n\ \ \"acc_norm_stderr\": 0.024662496845209818\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \ \ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7088122605363985,\n\ \ \"acc_stderr\": 0.016246087069701407,\n \"acc_norm\": 0.7088122605363985,\n\ \ \"acc_norm_stderr\": 0.016246087069701407\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.02668013476167922,\n\ \ \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.02668013476167922\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n\ \ \"acc_stderr\": 0.015949308790233638,\n \"acc_norm\": 0.34972067039106147,\n\ \ \"acc_norm_stderr\": 0.015949308790233638\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.028358956313423545,\n\ \ \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.028358956313423545\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\ \ \"acc_stderr\": 0.027417996705630998,\n \"acc_norm\": 0.6302250803858521,\n\ \ \"acc_norm_stderr\": 0.027417996705630998\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.027237415094592474,\n\ \ \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.027237415094592474\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251455,\n \ \ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251455\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38722294654498046,\n\ \ \"acc_stderr\": 0.012441155326854927,\n \"acc_norm\": 0.38722294654498046,\n\ \ \"acc_norm_stderr\": 0.012441155326854927\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904611,\n\ \ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904611\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5359477124183006,\n \"acc_stderr\": 0.02017548876548405,\n \ \ \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.02017548876548405\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\ \ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\ \ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726492,\n\ \ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726492\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\ \ \"acc_stderr\": 0.03187187537919796,\n \"acc_norm\": 0.7164179104477612,\n\ \ \"acc_norm_stderr\": 0.03187187537919796\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\ \ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.4578313253012048,\n\ \ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 
0.034240429246915824,\n\ \ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.034240429246915824\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.412484700122399,\n\ \ \"mc1_stderr\": 0.01723329939957122,\n \"mc2\": 0.5940288949043588,\n\ \ \"mc2_stderr\": 0.015208554054531144\n }\n}\n```" repo_url: https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v3 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|arc:challenge|25_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hellaswag|10_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T01-16-32.937269.parquet' 
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T01-16-32.937269.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T01-16-32.937269.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-11T01-16-32.937269.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T01-16-32.937269.parquet' - config_name: 
harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-management|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-public_relations|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T01-16-32.937269.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_11T01_16_32.937269 path: - '**/details_harness|truthfulqa:mc|0_2023-10-11T01-16-32.937269.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-11T01-16-32.937269.parquet' - config_name: results data_files: - split: 2023_10_11T01_16_32.937269 path: - results_2023-10-11T01-16-32.937269.parquet - split: latest path: - results_2023-10-11T01-16-32.937269.parquet ---

# Dataset Card for Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-7B-v3

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [Severian/ANIMA-Phi-Neptune-Mistral-7B-v3](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v3",
    "harness_truthfulqa_mc_0",
    split="train",
)
```

## Latest results

These are the [latest results from run 2023-10-11T01:16:32.937269](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v3/blob/main/results_2023-10-11T01-16-32.937269.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": { "acc": 0.5392786633208031, "acc_stderr": 0.03494779312823446, "acc_norm": 0.5431264898387953, "acc_norm_stderr": 0.03493217150757376, "mc1": 0.412484700122399, "mc1_stderr": 0.01723329939957122, "mc2": 0.5940288949043588, "mc2_stderr": 0.015208554054531144 },
    "harness|arc:challenge|25": { "acc": 0.5401023890784983, "acc_stderr": 0.01456431885692485, "acc_norm": 0.568259385665529, "acc_norm_stderr": 0.014474591427196202 },
    "harness|hellaswag|10": { "acc": 0.5893248356901015, "acc_stderr": 0.004909509538525159, "acc_norm": 0.7881896036646087, "acc_norm_stderr": 0.004077561349272391 },
    "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 },
    "harness|hendrycksTest-anatomy|5": { "acc": 0.4666666666666667, "acc_stderr": 0.043097329010363554, "acc_norm": 0.4666666666666667, "acc_norm_stderr": 0.043097329010363554 },
    "harness|hendrycksTest-astronomy|5": { "acc": 0.5263157894736842, "acc_stderr": 0.04063302731486671, "acc_norm": 0.5263157894736842, "acc_norm_stderr": 0.04063302731486671 },
    "harness|hendrycksTest-business_ethics|5": { "acc": 0.55, "acc_stderr": 0.04999999999999999, "acc_norm": 0.55, "acc_norm_stderr": 0.04999999999999999 },
    "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6377358490566037, "acc_stderr": 0.0295822451283843, "acc_norm": 0.6377358490566037, "acc_norm_stderr": 0.0295822451283843 },
    "harness|hendrycksTest-college_biology|5": { "acc": 0.5347222222222222, "acc_stderr": 0.04171115858181618, "acc_norm": 0.5347222222222222, "acc_norm_stderr": 0.04171115858181618 },
    "harness|hendrycksTest-college_chemistry|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 },
    "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 },
    "harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 },
    "harness|hendrycksTest-college_medicine|5": { "acc": 0.5722543352601156, "acc_stderr": 0.037724468575180276, "acc_norm": 0.5722543352601156, "acc_norm_stderr": 0.037724468575180276 },
    "harness|hendrycksTest-college_physics|5": { "acc": 0.30392156862745096, "acc_stderr": 0.045766654032077615, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.045766654032077615 },
    "harness|hendrycksTest-computer_security|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 },
    "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4425531914893617, "acc_stderr": 0.03246956919789958, "acc_norm": 0.4425531914893617, "acc_norm_stderr": 0.03246956919789958 },
    "harness|hendrycksTest-econometrics|5": { "acc": 0.38596491228070173, "acc_stderr": 0.04579639422070435, "acc_norm": 0.38596491228070173, "acc_norm_stderr": 0.04579639422070435 },
    "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 },
    "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3439153439153439, "acc_stderr": 0.024464426625596433, "acc_norm": 0.3439153439153439, "acc_norm_stderr": 0.024464426625596433 },
    "harness|hendrycksTest-formal_logic|5": { "acc": 0.3888888888888889, "acc_stderr": 0.0436031486007746, "acc_norm": 0.3888888888888889, "acc_norm_stderr": 0.0436031486007746 },
    "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 },
    "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6161290322580645, "acc_stderr": 0.02766618207553965, "acc_norm": 0.6161290322580645, "acc_norm_stderr": 0.02766618207553965 },
    "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.37438423645320196, "acc_stderr": 0.03405155380561952, "acc_norm": 0.37438423645320196, "acc_norm_stderr": 0.03405155380561952 },
    "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 },
    "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6545454545454545, "acc_stderr": 0.03713158067481913, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.03713158067481913 },
    "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6565656565656566, "acc_stderr": 0.03383201223244441, "acc_norm": 0.6565656565656566, "acc_norm_stderr": 0.03383201223244441 },
    "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7357512953367875, "acc_stderr": 0.03182155050916645, "acc_norm": 0.7357512953367875, "acc_norm_stderr": 0.03182155050916645 },
    "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.47692307692307695, "acc_stderr": 0.025323990861736118, "acc_norm": 0.47692307692307695, "acc_norm_stderr": 0.025323990861736118 },
    "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2962962962962963, "acc_stderr": 0.027840811495871937, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.027840811495871937 },
    "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.4957983193277311, "acc_stderr": 0.0324773433444811, "acc_norm": 0.4957983193277311, "acc_norm_stderr": 0.0324773433444811 },
    "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2781456953642384, "acc_stderr": 0.03658603262763743, "acc_norm": 0.2781456953642384, "acc_norm_stderr": 0.03658603262763743 },
    "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7229357798165138, "acc_stderr": 0.01918848259016953, "acc_norm": 0.7229357798165138, "acc_norm_stderr": 0.01918848259016953 },
    "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.39814814814814814, "acc_stderr": 0.033384734032074016, "acc_norm": 0.39814814814814814, "acc_norm_stderr": 0.033384734032074016 },
    "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6519607843137255, "acc_stderr": 0.03343311240488418, "acc_norm": 0.6519607843137255, "acc_norm_stderr": 0.03343311240488418 },
    "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7088607594936709, "acc_stderr": 0.02957160106575337, "acc_norm": 0.7088607594936709, "acc_norm_stderr": 0.02957160106575337 },
    "harness|hendrycksTest-human_aging|5": { "acc": 0.6098654708520179, "acc_stderr": 0.03273766725459157, "acc_norm": 0.6098654708520179, "acc_norm_stderr": 0.03273766725459157 },
    "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6183206106870229, "acc_stderr": 0.0426073515764456, "acc_norm": 0.6183206106870229, "acc_norm_stderr": 0.0426073515764456 },
    "harness|hendrycksTest-international_law|5": { "acc": 0.6446280991735537, "acc_stderr": 0.0436923632657398, "acc_norm": 0.6446280991735537, "acc_norm_stderr": 0.0436923632657398 },
    "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6111111111111112, "acc_stderr": 0.04712821257426769, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.04712821257426769 },
    "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6196319018404908, "acc_stderr": 0.038142698932618374, "acc_norm": 0.6196319018404908, "acc_norm_stderr": 0.038142698932618374 },
    "harness|hendrycksTest-machine_learning|5": { "acc": 0.39285714285714285, "acc_stderr": 0.04635550135609976, "acc_norm": 0.39285714285714285, "acc_norm_stderr": 0.04635550135609976 },
    "harness|hendrycksTest-management|5": { "acc": 0.7184466019417476, "acc_stderr": 0.04453254836326468, "acc_norm": 0.7184466019417476, "acc_norm_stderr": 0.04453254836326468 },
    "harness|hendrycksTest-marketing|5": { "acc": 0.8290598290598291, "acc_stderr": 0.024662496845209818, "acc_norm": 0.8290598290598291, "acc_norm_stderr": 0.024662496845209818 },
    "harness|hendrycksTest-medical_genetics|5": { "acc": 0.62, "acc_stderr": 0.04878317312145633, "acc_norm": 0.62, "acc_norm_stderr": 0.04878317312145633 },
    "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7088122605363985, "acc_stderr": 0.016246087069701407, "acc_norm": 0.7088122605363985, "acc_norm_stderr": 0.016246087069701407 },
    "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5664739884393064, "acc_stderr": 0.02668013476167922, "acc_norm": 0.5664739884393064, "acc_norm_stderr": 0.02668013476167922 },
    "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.34972067039106147, "acc_stderr": 0.015949308790233638, "acc_norm": 0.34972067039106147, "acc_norm_stderr": 0.015949308790233638 },
    "harness|hendrycksTest-nutrition|5": { "acc": 0.5686274509803921, "acc_stderr": 0.028358956313423545, "acc_norm": 0.5686274509803921, "acc_norm_stderr": 0.028358956313423545 },
    "harness|hendrycksTest-philosophy|5": { "acc": 0.6302250803858521, "acc_stderr": 0.027417996705630998, "acc_norm": 0.6302250803858521, "acc_norm_stderr": 0.027417996705630998 },
    "harness|hendrycksTest-prehistory|5": { "acc": 0.6018518518518519, "acc_stderr": 0.027237415094592474, "acc_norm": 0.6018518518518519, "acc_norm_stderr": 0.027237415094592474 },
    "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3723404255319149, "acc_stderr": 0.028838921471251455, "acc_norm": 0.3723404255319149, "acc_norm_stderr": 0.028838921471251455 },
    "harness|hendrycksTest-professional_law|5": { "acc": 0.38722294654498046, "acc_stderr": 0.012441155326854927, "acc_norm": 0.38722294654498046, "acc_norm_stderr": 0.012441155326854927 },
    "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4852941176470588, "acc_stderr": 0.03035969707904611, "acc_norm": 0.4852941176470588, "acc_norm_stderr": 0.03035969707904611 },
    "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5359477124183006, "acc_stderr": 0.02017548876548405, "acc_norm": 0.5359477124183006, "acc_norm_stderr": 0.02017548876548405 },
    "harness|hendrycksTest-public_relations|5": { "acc": 0.5909090909090909, "acc_stderr": 0.04709306978661895, "acc_norm": 0.5909090909090909, "acc_norm_stderr": 0.04709306978661895 },
    "harness|hendrycksTest-security_studies|5": { "acc": 0.6530612244897959, "acc_stderr": 0.030472526026726492, "acc_norm": 0.6530612244897959, "acc_norm_stderr": 0.030472526026726492 },
    "harness|hendrycksTest-sociology|5": { "acc": 0.7164179104477612, "acc_stderr": 0.03187187537919796, "acc_norm": 0.7164179104477612, "acc_norm_stderr": 0.03187187537919796 },
    "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 },
    "harness|hendrycksTest-virology|5": { "acc": 0.4578313253012048, "acc_stderr": 0.0387862677100236, "acc_norm": 0.4578313253012048, "acc_norm_stderr": 0.0387862677100236 },
    "harness|hendrycksTest-world_religions|5": { "acc": 0.7251461988304093, "acc_stderr": 0.034240429246915824, "acc_norm": 0.7251461988304093, "acc_norm_stderr": 0.034240429246915824 },
    "harness|truthfulqa:mc|0": { "mc1": 0.412484700122399, "mc1_stderr": 0.01723329939957122, "mc2": 0.5940288949043588, "mc2_stderr": 0.015208554054531144 }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]