open-llm-leaderboard/details_juhwanlee__llmdo-Mistral-7B-case-1
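The card below stores per-task evaluation details under configs named after each harness task (e.g. `harness_arc_challenge_25`, `harness_gsm8k_5`), backed by parquet files named `details_harness|<task>|<n>_<timestamp>.parquet`. As a minimal sketch of that naming scheme (the helper functions are illustrative only, not part of any library):

```python
# Sketch: derive the config name and parquet glob this details dataset
# uses for a given harness task. Helper names are hypothetical.

def config_name(task: str, num_fewshot: int) -> str:
    # "harness|arc:challenge|25" is exposed as config "harness_arc_challenge_25"
    return f"harness_{task.replace(':', '_').replace('-', '_')}_{num_fewshot}"

def parquet_glob(task: str, num_fewshot: int, timestamp: str) -> str:
    # Files are stored as details_harness|<task>|<n>_<timestamp>.parquet
    return f"**/details_harness|{task}|{num_fewshot}_{timestamp}.parquet"

print(config_name("gsm8k", 5))  # harness_gsm8k_5
print(parquet_glob("gsm8k", 5, "2024-03-11T19-13-37.745204"))
```

Each config also exposes a `latest` split alongside the timestamped one, so `split="latest"` always selects the most recent run.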
--- pretty_name: Evaluation run of juhwanlee/llmdo-Mistral-7B-case-1 dataset_summary: "Dataset automatically created during the evaluation run of model [juhwanlee/llmdo-Mistral-7B-case-1](https://huggingface.co/juhwanlee/llmdo-Mistral-7B-case-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_juhwanlee__llmdo-Mistral-7B-case-1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-03-11T19:13:37.745204](https://huggingface.co/datasets/open-llm-leaderboard/details_juhwanlee__llmdo-Mistral-7B-case-1/blob/main/results_2024-03-11T19-13-37.745204.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6324163759083724,\n\ \ \"acc_stderr\": 0.032412220193761165,\n \"acc_norm\": 0.6377710742584999,\n\ \ \"acc_norm_stderr\": 0.0330673242578233,\n \"mc1\": 0.3084455324357405,\n\ \ \"mc1_stderr\": 0.01616803938315687,\n \"mc2\": 0.45690932932637046,\n\ \ \"mc2_stderr\": 0.014755655698380018\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225402,\n\ \ \"acc_norm\": 0.621160409556314,\n \"acc_norm_stderr\": 0.014175915490000324\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6353316072495518,\n\ \ \"acc_stderr\": 0.004803533333364223,\n \"acc_norm\": 0.8359888468432584,\n\ \ \"acc_norm_stderr\": 0.003695289340514483\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\ \ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\ \ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n\ \ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\ \ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \ \ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\ \ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\ \ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\ \ \"acc_norm_stderr\": 
0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \ \ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\ : 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\ \ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\ \ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\ \ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\ \ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\ \ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\ \ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\ \ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\ \ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\ \ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\ acc_norm\": 
0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\ \ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\ \ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.7483870967741936,\n \"acc_stderr\": 0.02468597928623997,\n \"\ acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.02468597928623997\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\ acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\"\ : 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\ \ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\ acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593552,\n\ \ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593552\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \ \ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \ \ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\ \ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\ acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010333,\n \"\ acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010333\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\ acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\ acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n \ \ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\ \ \"acc_stderr\": 0.031024411740572206,\n \"acc_norm\": 0.6905829596412556,\n\ \ \"acc_norm_stderr\": 0.031024411740572206\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\ \ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\ : 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\ \ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\ \ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911901,\n\ \ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911901\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\ \ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\ \ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\ \ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\ \ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\ \ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\ \ \"acc_stderr\": 0.013625556907993457,\n \"acc_norm\": 0.8237547892720306,\n\ \ \"acc_norm_stderr\": 0.013625556907993457\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247323,\n\ \ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247323\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40782122905027934,\n\ \ \"acc_stderr\": 0.016435865260914746,\n \"acc_norm\": 
0.40782122905027934,\n\ \ \"acc_norm_stderr\": 0.016435865260914746\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.02536060379624256,\n\ \ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.02536060379624256\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\ \ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\ \ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\ \ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \ \ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4530638852672751,\n\ \ \"acc_stderr\": 0.012713845972358981,\n \"acc_norm\": 0.4530638852672751,\n\ \ \"acc_norm_stderr\": 0.012713845972358981\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.02922719246003203,\n\ \ \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.02922719246003203\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507215,\n \ \ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507215\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\ \ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\ \ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675606,\n\ \ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675606\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\ \ \"acc_stderr\": 0.026508590656233257,\n \"acc_norm\": 0.8308457711442786,\n\ \ \"acc_norm_stderr\": 0.026508590656233257\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \ \ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\ \ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\ \ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\ \ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3084455324357405,\n\ \ \"mc1_stderr\": 0.01616803938315687,\n \"mc2\": 0.45690932932637046,\n\ \ \"mc2_stderr\": 0.014755655698380018\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987729\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3912054586808188,\n \ \ \"acc_stderr\": 0.013442502402794302\n }\n}\n```" repo_url: https://huggingface.co/juhwanlee/llmdo-Mistral-7B-case-1 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|arc:challenge|25_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-11T19-13-37.745204.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|gsm8k|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_11T19_13_37.745204 path: - 
'**/details_harness|hellaswag|10_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-13-37.745204.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-13-37.745204.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-13-37.745204.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-13-37.745204.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-13-37.745204.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-13-37.745204.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-13-37.745204.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-management|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-13-37.745204.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|truthfulqa:mc|0_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-11T19-13-37.745204.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_11T19_13_37.745204 path: - '**/details_harness|winogrande|5_2024-03-11T19-13-37.745204.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-11T19-13-37.745204.parquet' - config_name: results data_files: - split: 
2024_03_11T19_13_37.745204 path: - results_2024-03-11T19-13-37.745204.parquet - split: latest path: - results_2024-03-11T19-13-37.745204.parquet --- # Dataset Card for Evaluation run of juhwanlee/llmdo-Mistral-7B-case-1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [juhwanlee/llmdo-Mistral-7B-case-1](https://huggingface.co/juhwanlee/llmdo-Mistral-7B-case-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_juhwanlee__llmdo-Mistral-7B-case-1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-03-11T19:13:37.745204](https://huggingface.co/datasets/open-llm-leaderboard/details_juhwanlee__llmdo-Mistral-7B-case-1/blob/main/results_2024-03-11T19-13-37.745204.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6324163759083724, "acc_stderr": 0.032412220193761165, "acc_norm": 0.6377710742584999, "acc_norm_stderr": 0.0330673242578233, "mc1": 0.3084455324357405, "mc1_stderr": 0.01616803938315687, "mc2": 0.45690932932637046, "mc2_stderr": 0.014755655698380018 }, "harness|arc:challenge|25": { "acc": 0.5878839590443686, "acc_stderr": 0.014383915302225402, "acc_norm": 0.621160409556314, "acc_norm_stderr": 0.014175915490000324 }, "harness|hellaswag|10": { "acc": 0.6353316072495518, "acc_stderr": 0.004803533333364223, "acc_norm": 0.8359888468432584, "acc_norm_stderr": 0.003695289340514483 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.041716541613545426, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.041716541613545426 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6710526315789473, "acc_stderr": 0.03823428969926605, "acc_norm": 0.6710526315789473, "acc_norm_stderr": 0.03823428969926605 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6943396226415094, "acc_stderr": 0.028353298073322663, "acc_norm": 0.6943396226415094, "acc_norm_stderr": 0.028353298073322663 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7222222222222222, "acc_stderr": 0.037455547914624555, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.037455547914624555 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, 
"acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6416184971098265, "acc_stderr": 0.036563436533531585, "acc_norm": 0.6416184971098265, "acc_norm_stderr": 0.036563436533531585 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.79, "acc_stderr": 0.04093601807403326, "acc_norm": 0.79, "acc_norm_stderr": 0.04093601807403326 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5829787234042553, "acc_stderr": 0.03223276266711712, "acc_norm": 0.5829787234042553, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.046920083813689104, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.046920083813689104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41798941798941797, "acc_stderr": 0.025402555503260912, "acc_norm": 0.41798941798941797, "acc_norm_stderr": 0.025402555503260912 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42857142857142855, "acc_stderr": 0.0442626668137991, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.0442626668137991 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7483870967741936, "acc_stderr": 0.02468597928623997, "acc_norm": 0.7483870967741936, "acc_norm_stderr": 0.02468597928623997 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.67, "acc_stderr": 0.047258156262526066, "acc_norm": 0.67, "acc_norm_stderr": 0.047258156262526066 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7878787878787878, "acc_stderr": 0.03192271569548301, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.03192271569548301 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.803030303030303, "acc_stderr": 0.028335609732463362, "acc_norm": 0.803030303030303, "acc_norm_stderr": 0.028335609732463362 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8756476683937824, "acc_stderr": 0.023814477086593552, "acc_norm": 0.8756476683937824, "acc_norm_stderr": 0.023814477086593552 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.658974358974359, "acc_stderr": 0.02403548967633508, "acc_norm": 0.658974358974359, "acc_norm_stderr": 0.02403548967633508 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.028661201116524575, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.028661201116524575 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6848739495798319, "acc_stderr": 0.030176808288974337, "acc_norm": 0.6848739495798319, "acc_norm_stderr": 0.030176808288974337 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8256880733944955, "acc_stderr": 0.016265675632010333, "acc_norm": 0.8256880733944955, "acc_norm_stderr": 0.016265675632010333 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4861111111111111, "acc_stderr": 
0.03408655867977748, "acc_norm": 0.4861111111111111, "acc_norm_stderr": 0.03408655867977748 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7990196078431373, "acc_stderr": 0.028125972265654373, "acc_norm": 0.7990196078431373, "acc_norm_stderr": 0.028125972265654373 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7721518987341772, "acc_stderr": 0.02730348459906943, "acc_norm": 0.7721518987341772, "acc_norm_stderr": 0.02730348459906943 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.031024411740572206, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.031024411740572206 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7633587786259542, "acc_stderr": 0.03727673575596913, "acc_norm": 0.7633587786259542, "acc_norm_stderr": 0.03727673575596913 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7975460122699386, "acc_stderr": 0.03157065078911901, "acc_norm": 0.7975460122699386, "acc_norm_stderr": 0.03157065078911901 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5178571428571429, "acc_stderr": 0.047427623612430116, "acc_norm": 0.5178571428571429, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.02190190511507333, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.02190190511507333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 
0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8237547892720306, "acc_stderr": 0.013625556907993457, "acc_norm": 0.8237547892720306, "acc_norm_stderr": 0.013625556907993457 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.708092485549133, "acc_stderr": 0.024476994076247323, "acc_norm": 0.708092485549133, "acc_norm_stderr": 0.024476994076247323 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.40782122905027934, "acc_stderr": 0.016435865260914746, "acc_norm": 0.40782122905027934, "acc_norm_stderr": 0.016435865260914746 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7320261437908496, "acc_stderr": 0.02536060379624256, "acc_norm": 0.7320261437908496, "acc_norm_stderr": 0.02536060379624256 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.707395498392283, "acc_stderr": 0.02583989833487798, "acc_norm": 0.707395498392283, "acc_norm_stderr": 0.02583989833487798 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7345679012345679, "acc_stderr": 0.024569223600460845, "acc_norm": 0.7345679012345679, "acc_norm_stderr": 0.024569223600460845 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.48936170212765956, "acc_stderr": 0.029820747191422473, "acc_norm": 0.48936170212765956, "acc_norm_stderr": 0.029820747191422473 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4530638852672751, "acc_stderr": 0.012713845972358981, "acc_norm": 0.4530638852672751, "acc_norm_stderr": 0.012713845972358981 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6360294117647058, "acc_stderr": 0.02922719246003203, "acc_norm": 0.6360294117647058, "acc_norm_stderr": 0.02922719246003203 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.673202614379085, "acc_stderr": 0.018975427920507215, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.018975427920507215 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 
0.04607582090719976, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.04607582090719976 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7142857142857143, "acc_stderr": 0.028920583220675606, "acc_norm": 0.7142857142857143, "acc_norm_stderr": 0.028920583220675606 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8308457711442786, "acc_stderr": 0.026508590656233257, "acc_norm": 0.8308457711442786, "acc_norm_stderr": 0.026508590656233257 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.0377525168068637, "acc_norm": 0.83, "acc_norm_stderr": 0.0377525168068637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5060240963855421, "acc_stderr": 0.03892212195333045, "acc_norm": 0.5060240963855421, "acc_norm_stderr": 0.03892212195333045 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8187134502923976, "acc_stderr": 0.029547741687640044, "acc_norm": 0.8187134502923976, "acc_norm_stderr": 0.029547741687640044 }, "harness|truthfulqa:mc|0": { "mc1": 0.3084455324357405, "mc1_stderr": 0.01616803938315687, "mc2": 0.45690932932637046, "mc2_stderr": 0.014755655698380018 }, "harness|winogrande|5": { "acc": 0.7916337805840569, "acc_stderr": 0.011414554399987729 }, "harness|gsm8k|5": { "acc": 0.3912054586808188, "acc_stderr": 0.013442502402794302 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
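The config names listed in the metadata above appear to follow a simple convention derived from the harness task names embedded in the parquet file names: `|`, `-`, and `:` are replaced with underscores (e.g. `harness|winogrande|5` becomes `harness_winogrande_5`). A minimal sketch of that mapping, as a hypothetical helper (not part of any official API):

```python
def task_to_config_name(task: str) -> str:
    """Map a harness task name (as it appears in the parquet file names)
    to the corresponding dataset config name, e.g.
    "harness|hendrycksTest-college_physics|5"
      -> "harness_hendrycksTest_college_physics_5".
    """
    # Replace every separator character used in task names with "_".
    return task.replace("|", "_").replace("-", "_").replace(":", "_")

print(task_to_config_name("harness|winogrande|5"))  # harness_winogrande_5
```

This is only an observed pattern in this card's metadata; the authoritative config names are the ones listed in the YAML above.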
AdapterOcean/physics_dataset_standardized
--- dataset_info: features: - name: message dtype: string - name: message_type dtype: string - name: message_id dtype: int64 - name: conversation_id dtype: int64 splits: - name: train num_bytes: 50580506 num_examples: 40000 download_size: 22905844 dataset_size: 50580506 configs: - config_name: default data_files: - split: train path: data/train-* ---
AchrafLou/image_captioned_40
--- dataset_info: features: - name: text dtype: string - name: image dtype: image splits: - name: train num_bytes: 246618.0 num_examples: 38 - name: validation num_bytes: 59992.0 num_examples: 10 download_size: 280777 dataset_size: 306610.0 --- # Dataset Card for "image_captioned_40" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Seenka/directv-zocalos_1.0fps_21-08-2023_24-08-2023
--- dataset_info: features: - name: image dtype: image - name: image_filename dtype: string - name: frame_time dtype: time64[us] - name: video_storage_path dtype: string - name: zocalo_id dtype: string - name: frame_number dtype: int64 - name: is_L_shape dtype: bool - name: horizontal_check dtype: bool - name: vertical_check dtype: bool - name: black_image dtype: bool - name: horizontal_xmin dtype: int64 - name: horizontal_xmax dtype: int64 - name: horizontal_ymin dtype: int64 - name: horizontal_ymax dtype: int64 - name: vertical_xmin dtype: int64 - name: vertical_xmax dtype: int64 - name: vertical_ymin dtype: int64 - name: vertical_ymax dtype: int64 - name: cropped_image_horizontal dtype: image - name: cropped_image_vertical dtype: 'null' - name: width dtype: int64 - name: height dtype: int64 - name: embedding_horizontal sequence: float32 splits: - name: train num_bytes: 8554665.0 num_examples: 10 download_size: 4263397 dataset_size: 8554665.0 --- # Dataset Card for "directv-zocalos_1.0fps_21-08-2023_24-08-2023" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
desiai/archiveoldsamachaar
--- license: odc-by ---
Elapsedf/Plant-Pathology
--- license: mit ---
joey234/mmlu-high_school_european_history-original-neg-prepend
--- dataset_info: features: - name: question dtype: string - name: choices sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: neg_prompt dtype: string splits: - name: test num_bytes: 468329 num_examples: 140 download_size: 250111 dataset_size: 468329 --- # Dataset Card for "mmlu-high_school_european_history-original-neg-prepend" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
cmxuebuhui/glm_train
--- license: apache-2.0 ---
gayanin/babylon-native-mixed
--- dataset_info: - config_name: prob-0.1 features: - name: refs dtype: string - name: trans dtype: string splits: - name: train num_bytes: 707260 num_examples: 5293 - name: test num_bytes: 79603 num_examples: 662 - name: validation num_bytes: 77176 num_examples: 662 download_size: 498401 dataset_size: 864039 - config_name: prob-0.2 features: - name: refs dtype: string - name: trans dtype: string splits: - name: train num_bytes: 707658 num_examples: 5293 - name: test num_bytes: 79464 num_examples: 662 - name: validation num_bytes: 77409 num_examples: 662 download_size: 513529 dataset_size: 864531 - config_name: prob-0.3 features: - name: refs dtype: string - name: trans dtype: string splits: - name: train num_bytes: 707952 num_examples: 5293 - name: test num_bytes: 79465 num_examples: 662 - name: validation num_bytes: 77440 num_examples: 662 download_size: 525969 dataset_size: 864857 - config_name: prob-0.4 features: - name: refs dtype: string - name: trans dtype: string splits: - name: train num_bytes: 708853 num_examples: 5293 - name: test num_bytes: 79806 num_examples: 662 - name: validation num_bytes: 77283 num_examples: 662 download_size: 536929 dataset_size: 865942 - config_name: prob-0.5 features: - name: refs dtype: string - name: trans dtype: string splits: - name: train num_bytes: 709628 num_examples: 5293 - name: test num_bytes: 79791 num_examples: 662 - name: validation num_bytes: 77038 num_examples: 662 download_size: 545198 dataset_size: 866457 configs: - config_name: prob-0.1 data_files: - split: train path: prob-0.1/train-* - split: test path: prob-0.1/test-* - split: validation path: prob-0.1/validation-* - config_name: prob-0.2 data_files: - split: train path: prob-0.2/train-* - split: test path: prob-0.2/test-* - split: validation path: prob-0.2/validation-* - config_name: prob-0.3 data_files: - split: train path: prob-0.3/train-* - split: test path: prob-0.3/test-* - split: validation path: prob-0.3/validation-* - config_name: prob-0.4 
data_files: - split: train path: prob-0.4/train-* - split: test path: prob-0.4/test-* - split: validation path: prob-0.4/validation-* - config_name: prob-0.5 data_files: - split: train path: prob-0.5/train-* - split: test path: prob-0.5/test-* - split: validation path: prob-0.5/validation-* ---
argilla/ultrafeedback-binarized-preferences-cleaned-kto
---
language:
- en
license: mit
size_categories:
- 10K<n<100K
task_categories:
- text-generation
pretty_name: UltraFeedback Binarized Preferences Cleaned KTO
dataset_info:
  features:
  - name: prompt
    dtype: string
  - name: completion
    dtype: string
  - name: label
    dtype: bool
  - name: model
    dtype: string
  - name: average_rating
    dtype: float64
  - name: annotations
    struct:
    - name: helpfulness
      struct:
      - name: Rating
        dtype: string
      - name: Rationale
        dtype: string
      - name: Rationale For Rating
        dtype: string
      - name: Type
        sequence: string
    - name: honesty
      struct:
      - name: Rating
        dtype: string
      - name: Rationale
        dtype: string
    - name: instruction_following
      struct:
      - name: Rating
        dtype: string
      - name: Rationale
        dtype: string
    - name: truthfulness
      struct:
      - name: Rating
        dtype: string
      - name: Rationale
        dtype: string
      - name: Rationale For Rating
        dtype: string
      - name: Type
        sequence: string
  - name: source
    dtype: string
  splits:
  - name: train
    num_bytes: 673880007
    num_examples: 230720
  download_size: 226134542
  dataset_size: 673880007
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
tags:
- kto
- preference
- ultrafeedback
---

# UltraFeedback - Binarized using the Average of Preference Ratings (Cleaned) KTO

> A KTO-signal-transformed version of the highly loved [UltraFeedback Binarized Preferences Cleaned](https://huggingface.co/datasets/argilla/ultrafeedback-binarized-preferences-cleaned), the dataset Argilla prefers to use from now on when fine-tuning on UltraFeedback.

This dataset represents a new iteration on top of [`argilla/ultrafeedback-binarized-preferences`](https://huggingface.co/datasets/argilla/ultrafeedback-binarized-preferences), and is the **recommended and preferred dataset by Argilla to use from now on when fine-tuning on UltraFeedback**.
Read more about Argilla's approach towards UltraFeedback binarization at [`argilla/ultrafeedback-binarized-preferences/README.md`](https://huggingface.co/datasets/argilla/ultrafeedback-binarized-preferences/blob/main/README.md).

## Why KTO?

The [KTO paper](https://arxiv.org/abs/2402.01306) states:

- KTO matches or exceeds DPO performance at scales from 1B to 30B parameters. That is, taking a preference dataset of n DPO pairs and breaking it up into 2n examples for KTO can yield better generations, despite the model ostensibly learning from a weaker signal.
- KTO can handle extreme data imbalances, matching DPO performance while using up to 90% fewer desirable examples (i.e., examples of good generations). Its success thus cannot be ascribed to the alignment data being sourced from a preference dataset.
- When the pretrained model is sufficiently good, one can skip supervised finetuning and go straight to KTO without a loss in generation quality. In contrast, we find that without doing SFT first, DPO-aligned models are significantly worse at all scales.

## Reproduce KTO Transformation

Original [UltraFeedback binarized preference cleaned DPO dataset](https://huggingface.co/datasets/argilla/ultrafeedback-binarized-preferences-cleaned)

<a target="_blank" href="https://colab.research.google.com/drive/10MwyxzcQogwO8e1ZcVu7aGTQvjXWpFuD?usp=sharing">
  <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>
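The n-pairs-to-2n-examples expansion described in the KTO paper is easy to sketch. Below is a minimal, self-contained illustration (the linked Colab notebook holds the full transformation used for this dataset; the toy pair here is made up, and only this dataset's `prompt`/`completion`/`label` columns are modeled):

```python
def dpo_pairs_to_kto(pairs):
    """Expand n DPO preference pairs into 2n KTO examples.

    Each (prompt, chosen, rejected) pair yields one desirable
    (label=True) and one undesirable (label=False) completion.
    """
    rows = []
    for pair in pairs:
        rows.append({"prompt": pair["prompt"], "completion": pair["chosen"], "label": True})
        rows.append({"prompt": pair["prompt"], "completion": pair["rejected"], "label": False})
    return rows

# One DPO pair becomes two KTO examples.
pairs = [{"prompt": "What is 2+2?", "chosen": "4", "rejected": "5"}]
kto_rows = dpo_pairs_to_kto(pairs)
```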
malteos/m_hellaswag
---
configs:
- config_name: ar
  data_files:
  - split: validation
    path: ar_validation*
dataset_info:
  features:
  - name: ind
    dtype: int32
  - name: activity_label
    dtype: string
  - name: ctx_a
    dtype: string
  - name: ctx_b
    dtype: string
  - name: ctx
    dtype: string
  - name: endings
    dtype: string
  - name: source_id
    dtype: string
  - name: split
    dtype: string
  - name: split_type
    dtype: string
  - name: label
    dtype: string
  splits:
  - name: validation
---

Mirror of https://github.com/nlp-uoregon/mlmm-evaluation
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-22000
--- dataset_info: features: - name: input_ids sequence: sequence: int32 - name: attention_mask sequence: sequence: int8 - name: labels sequence: sequence: int64 splits: - name: train num_bytes: 13336000 num_examples: 1000 download_size: 661232 dataset_size: 13336000 configs: - config_name: default data_files: - split: train path: data/train-* ---
abhilashpotluri/lfqa_summary
---
license: cc-by-sa-4.0
task_categories:
- summarization
language:
- en
size_categories:
- 1K<n<10K
pretty_name: lfqa_summary
---

# Dataset Card for LFQA Summary

## Table of Contents

- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
  - [Dataset Summary](#dataset-summary)
  - [Languages](#languages)
- [Dataset Structure](#dataset-structure)
  - [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Additional Information](#additional-information)
  - [Licensing Information](#licensing-information)
  - [Citation Information](#citation-information)

## Dataset Description

- **Repository:** [Repo](https://github.com/utcsnlp/lfqa_summary)
- **Paper:** [Concise Answers to Complex Questions: Summarization of Long-Form Answers](TODO)
- **Point of Contact:** acpotluri[at]utexas.edu

### Dataset Summary

This dataset contains summarization data for long-form question answers.

### Languages

The dataset contains data in English.

## Dataset Structure

### Data Instances

Each instance is a (question, long-form answer) pair from one of the three data sources -- ELI5, WebGPT, and NQ.

### Data Fields

Each instance is a JSON dictionary with the following fields:

* `type`: The type of the annotation; all data should have `summary` as the value.
* `dataset`: The dataset this QA pair belongs to, one of [`NQ`, `ELI5`, `Web-GPT`].
* `q_id`: The question id, same as the original NQ or ELI5 dataset.
* `a_id`: The answer id, same as the original ELI5 dataset. For NQ, we populate a dummy `a_id` (1).
* `question`: The question.
* `answer_paragraph`: The answer paragraph.
* `answer_sentences`: The list of answer sentences, tokenized from the answer paragraph.
* `summary_sentences`: The list of summary sentence indices (starting from 1).
* `is_summary_count`: For each sentence in `answer_sentences`, the number of annotators who selected it as a summary sentence.
* `is_summary_1`: List of boolean values indicating whether annotator one selected the corresponding sentence as a summary sentence.
* `is_summary_2`: List of boolean values indicating whether annotator two selected the corresponding sentence as a summary sentence.
* `is_summary_3`: List of boolean values indicating whether annotator three selected the corresponding sentence as a summary sentence.

### Data Splits

Train/dev/test splits are provided in the uploaded dataset.

## Dataset Creation

Please refer to our [paper](TODO) and datasheet for details on dataset creation, annotation process, and discussion of limitations.

## Additional Information

### Licensing Information

https://creativecommons.org/licenses/by-sa/4.0/legalcode

### Citation Information

```
@inproceedings{TODO,
    title = {Concise Answers to Complex Questions: Summarization of Long-Form Answers},
    author = {Potluri, Abhilash and Xu, Fangyuan and Choi, Eunsol},
    year = 2023,
    booktitle = {Proceedings of the Annual Meeting of the Association for Computational Linguistics},
    note = {Long paper}
}
```
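To illustrate how the annotation fields fit together, a sentence-level summary can be recovered by majority vote over `is_summary_count`. The field names below come from this card, but the instance itself is made up for illustration:

```python
def majority_vote_summary(instance, min_votes=2):
    """Keep answer sentences that at least `min_votes` of the
    three annotators marked as summary sentences."""
    return [
        sentence
        for sentence, votes in zip(instance["answer_sentences"], instance["is_summary_count"])
        if votes >= min_votes
    ]

# A made-up instance following the card's field layout.
instance = {
    "answer_sentences": ["First point.", "An aside.", "Second point."],
    "is_summary_count": [3, 0, 2],
}
summary = majority_vote_summary(instance)
```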
corysilas/VzCory
--- license: openrail ---
irds/mmarco_de_train
--- pretty_name: '`mmarco/de/train`' viewer: false source_datasets: ['irds/mmarco_de'] task_categories: - text-retrieval --- # Dataset Card for `mmarco/de/train` The `mmarco/de/train` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package. For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/de/train). # Data This dataset provides: - `queries` (i.e., topics); count=808,731 - `qrels`: (relevance assessments); count=532,761 - `docpairs`; count=39,780,811 - For `docs`, use [`irds/mmarco_de`](https://huggingface.co/datasets/irds/mmarco_de) ## Usage ```python from datasets import load_dataset queries = load_dataset('irds/mmarco_de_train', 'queries') for record in queries: record # {'query_id': ..., 'text': ...} qrels = load_dataset('irds/mmarco_de_train', 'qrels') for record in qrels: record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...} docpairs = load_dataset('irds/mmarco_de_train', 'docpairs') for record in docpairs: record # {'query_id': ..., 'doc_id_a': ..., 'doc_id_b': ...} ``` Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the data in 🤗 Dataset format. ## Citation Information ``` @article{Bonifacio2021MMarco, title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset}, author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira}, year={2021}, journal={arXiv:2108.13897} } ```
NLPC-UOM/document_alignment_dataset-Sinhala-Tamil-English
---
task_categories:
- sentence-similarity
language:
- si
- ta
- en
---

### **Dataset summary**

This is a gold-standard benchmark dataset for document alignment between the Sinhala, English, and Tamil languages. Data was crawled from the following news websites.

| News Source | url |
| ------------- |-----------------------------|
| Army | https://www.army.lk/ |
| Hiru | http://www.hirunews.lk |
| ITN | https://www.itnnews.lk |
| Newsfirst | https://www.newsfirst.lk |

The aligned documents have been manually annotated.

### **Dataset**

The folder structure for each news source is as follows.

```
army
|--Sinhala
|--English
|--Tamil
|--armynews_english_sinhala.txt
|--armynews_english_tamil.txt
|--armynews_sinhala_tamil.txt
```

Sinhala/English/Tamil - contain the crawled data for the respective news source.

armynews_english_sinhala.txt - contains the annotated aligned documents between the English and Sinhala languages.

armynews_english_tamil.txt - contains the annotated aligned documents between the English and Tamil languages.

armynews_sinhala_tamil.txt - contains the annotated aligned documents between the Sinhala and Tamil languages.

## **Citation Information**

```
@article{fernando2022exploiting,
  title={Exploiting bilingual lexicons to improve multilingual embedding-based document and sentence alignment for low-resource languages},
  author={Fernando, Aloka and Ranathunga, Surangika and Sachintha, Dilan and Piyarathna, Lakmali and Rajitha, Charith},
  journal={Knowledge and Information Systems},
  pages={1--42},
  year={2022},
  publisher={Springer}
}
```
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-35000
--- dataset_info: features: - name: input_ids sequence: sequence: int32 - name: attention_mask sequence: sequence: int8 - name: labels sequence: sequence: int64 splits: - name: train num_bytes: 13336000 num_examples: 1000 download_size: 1089471 dataset_size: 13336000 configs: - config_name: default data_files: - split: train path: data/train-* ---
adamjweintraut/eli5_lfqa_top
--- dataset_info: features: - name: index dtype: int64 - name: q_id dtype: string - name: question dtype: string - name: best_answer dtype: string - name: all_answers sequence: string - name: num_answers dtype: int64 - name: top_answers sequence: string - name: num_top_answers dtype: int64 - name: context dtype: string - name: orig dtype: string - name: target dtype: string splits: - name: train num_bytes: 2794678797.7618504 num_examples: 183333 - name: test num_bytes: 349340566.11907476 num_examples: 22917 - name: validation num_bytes: 349340566.11907476 num_examples: 22917 download_size: 2106260396 dataset_size: 3493359930.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* - split: validation path: data/validation-* ---
liuyanchen1015/MULTI_VALUE_qqp_me_coordinate_subjects
--- dataset_info: features: - name: question1 dtype: string - name: question2 dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: dev num_bytes: 4841 num_examples: 19 - name: test num_bytes: 50642 num_examples: 197 - name: train num_bytes: 37708 num_examples: 145 download_size: 65241 dataset_size: 93191 --- # Dataset Card for "MULTI_VALUE_qqp_me_coordinate_subjects" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
huggingartists/ajr
--- language: - en tags: - huggingartists - lyrics --- # Dataset Card for "huggingartists/ajr" ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [How to use](#how-to-use) - [Dataset Structure](#dataset-structure) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [About](#about) ## Dataset Description - **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists) - **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists) - **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) - **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) - **Size of the generated dataset:** 0.216409 MB <div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: 
url(&#39;https://images.genius.com/84cbe6ced3b5398a810e82a9b65cff26.1000x1000x1.png&#39;)"> </div> </div> <a href="https://huggingface.co/huggingartists/ajr"> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div> </a> <div style="text-align: center; font-size: 16px; font-weight: 800">AJR</div> <a href="https://genius.com/artists/ajr"> <div style="text-align: center; font-size: 14px;">@ajr</div> </a> </div> ### Dataset Summary The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists. Model is available [here](https://huggingface.co/huggingartists/ajr). ### Supported Tasks and Leaderboards [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Languages en ## How to use How to load this dataset directly with the datasets library: ```python from datasets import load_dataset dataset = load_dataset("huggingartists/ajr") ``` ## Dataset Structure An example of 'train' looks as follows. ``` This example was too long and was cropped: { "text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..." } ``` ### Data Fields The data fields are the same among all splits. - `text`: a `string` feature. 
### Data Splits

| train | validation | test |
|------:|-----------:|-----:|
| 142 | - | - |

'Train' can be easily divided into 'train' & 'validation' & 'test' with a few lines of code:

```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/ajr")

train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03

train, validation, test = np.split(
    datasets['train']['text'],
    [int(len(datasets['train']['text']) * train_percentage),
     int(len(datasets['train']['text']) * (train_percentage + validation_percentage))])

datasets = DatasetDict(
    {
        'train': Dataset.from_dict({'text': list(train)}),
        'validation': Dataset.from_dict({'text': list(validation)}),
        'test': Dataset.from_dict({'text': list(test)})
    }
)
```

## Dataset Creation

### Curation Rationale

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

#### Who are the source language producers?

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Annotations

#### Annotation process

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Personal and Sensitive Information

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Discussion of Biases

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Other Known Limitations

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

## Additional Information

### Dataset Curators

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Licensing Information

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Citation Information

```
@InProceedings{huggingartists,
    author={Aleksey Korshuk},
    year={2021}
}
```

## About

*Built by Aleksey Korshuk*

[![Follow](https://img.shields.io/github/followers/AlekseyKorshuk?style=social)](https://github.com/AlekseyKorshuk)

[![Follow](https://img.shields.io/twitter/follow/alekseykorshuk?style=social)](https://twitter.com/intent/follow?screen_name=alekseykorshuk)

[![Follow](https://img.shields.io/badge/dynamic/json?color=blue&label=Telegram%20Channel&query=%24.result&url=https%3A%2F%2Fapi.telegram.org%2Fbot1929545866%3AAAFGhV-KKnegEcLiyYJxsc4zV6C-bdPEBtQ%2FgetChatMemberCount%3Fchat_id%3D-1001253621662&style=social&logo=telegram)](https://t.me/joinchat/_CQ04KjcJ-4yZTky)

For more details, visit the project repository.
[![GitHub stars](https://img.shields.io/github/stars/AlekseyKorshuk/huggingartists?style=social)](https://github.com/AlekseyKorshuk/huggingartists)
autoevaluate/autoeval-eval-kmfoda__booksum-kmfoda__booksum-373400-1514054915
--- type: predictions tags: - autotrain - evaluation datasets: - kmfoda/booksum eval_info: task: summarization model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP14 metrics: [] dataset_name: kmfoda/booksum dataset_config: kmfoda--booksum dataset_split: test col_mapping: text: chapter target: summary_text --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Summarization * Model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP14 * Dataset: kmfoda/booksum * Config: kmfoda--booksum * Split: test To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model.
Anees-Aslam/Cloud
--- license: cc-by-nc-nd-4.0 ---
murilo22/vozclone1
--- license: openrail ---
CyberHarem/hyuuga_hinata_naruto
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of hyuuga_hinata (NARUTO)

This is the dataset of hyuuga_hinata (NARUTO), containing 200 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
lsy641/PsyQA
---
license: mit
---

The data is originally sourced from (Sun et al., 2021). (Liu et al., 2023) processed the data into a Hugging Face dataset with training/validation/testing splits.

**Please cite:**

```
@misc{liu2023enhancing,
      title={Enhancing Long-form Text Generation in Mental Health with Task-adaptive Tokenization},
      author={Siyang Liu and Naihao Deng and Sahand Sabour and Yilin Jia and Minlie Huang and Rada Mihalcea},
      year={2023},
      eprint={2310.05317},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```

```
@inproceedings{sun2021psyqa,
  title={PsyQA: A Chinese Dataset for Generating Long Counseling Text for Mental Health Support},
  author={Sun, Hao and Lin, Zhenru and Zheng, Chujie and Liu, Siyang and Huang, Minlie},
  booktitle={Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021},
  pages={1489--1503},
  year={2021}
}
```
nirantk/dbpedia-entities-google-palm-gemini-embedding-001-100K
---
dataset_info:
  features:
  - name: _id
    dtype: string
  - name: title
    dtype: string
  - name: text
    dtype: string
  - name: embedding
    sequence: float64
  splits:
  - name: train
    num_bytes: 653564666
    num_examples: 100000
  download_size: 671003094
  dataset_size: 653564666
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
license: apache-2.0
task_categories:
- feature-extraction
language:
- en
pretty_name: 'DBPedia 100K: Gemini Google Embedding Model 001'
size_categories:
- 10K<n<100K
---

# Dataset Card for DBPedia 100K: Gemini Google Embedding Model 001

100K vectors from DBPedia!

Embedding Model: Google's latest Embedding Model 001 -- the successor to the Gecko models!

## Dataset Details

### Dataset Description

100K Google embeddings -- 768 dimensions

Created: December 2023

Text used for embedding: title (string) + text (string)

Embedding model: Google's `models/embedding-001`

- **Curated by:** [Nirant Kasliwal](https://nirantk.com/about)
- **Funded by:** [Qdrant GmbH](https://qdrant.tech)
- **Language(s) (NLP):** English
- **License:** Apache License 2.0

## Uses

This dataset is useful for benchmarking embedding performance and for testing the vectors against an existing dataset. E.g., you can compare Google and OpenAI embeddings for the same text using this dataset.

## Dataset Creation

Unlike the OpenAI embeddings, these were created using both the "title" and "content" attributes of the embedding API, along with `task_type="retrieval_document"`:

```python
import google.generativeai as genai

result = genai.embed_content(
    model="models/embedding-001",
    content="Qdrant is the best vector search engine to use with Gemini",
    task_type="retrieval_document",
    title="Qdrant x Gemini",
)
```

## Source Data

This dataset is a slice of the earlier work from @KShivendu_: https://huggingface.co/datasets/KShivendu/dbpedia-entities-openai-1M/

The 1M dataset was generated from the first 1M entries of https://huggingface.co/datasets/BeIR/dbpedia-entity

From those 1M, I selected 100K at random and created embeddings from them.
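Comparing providers (e.g. Google vs. OpenAI vectors for the same text, as suggested under Uses) typically comes down to cosine similarity between embedding vectors. A minimal dependency-free sketch, with the 768-dim vectors replaced by toy 3-dim stand-ins:

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy stand-ins for two providers' embeddings of the same text.
google_vec = [0.1, 0.3, 0.5]
openai_vec = [0.1, 0.29, 0.52]
similarity = cosine_similarity(google_vec, openai_vec)  # close to 1.0
```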
### Recommendations

The dataset is released as is; I'm not aware of biases, limitations, or other risks arising from the use of embedding models and datasets. Embedding models are not cryptographically secure and should not be used for security use cases.

## Dataset Card Authors

- [Nirant Kasliwal](https://nirantk.com/about/)

## Dataset Card Contact

Write to nirant [dot] kasliwal [at] qdrant.com if you have questions!
TeeZee/dolly-15k-pirate-speech
---
license: cc-by-sa-3.0
task_categories:
- question-answering
- summarization
- text-generation
language:
- en
pretty_name: pirate speech
---

Dataset for writing-style-transfer experimentation, based on this article: https://ai-r.com/blog/pirate-linguistics-and-tone-of-voice-fine-tuning-llms-to-talk-like-swashbucklers

Only the responses are in 'pirate speech'. The [arrr](https://pypi.org/project/arrr/) Python library was used to convert the original responses to 'pirate speech'.
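The construction is a straightforward map over the responses. A minimal sketch of the idea: `piratify` below is a toy stand-in for the actual `arrr` translation (its substitution table and output are made up, not what `arrr` produces), and the record layout mirrors dolly-15k's instruction/response fields:

```python
# Toy substitution table standing in for the arrr library's real rules.
PIRATE_SUBS = {"hello": "ahoy", "my": "me", "you": "ye", "yes": "aye"}

def piratify(text):
    """Word-by-word substitution plus a trailing interjection."""
    words = [PIRATE_SUBS.get(w.lower(), w) for w in text.split()]
    return " ".join(words) + " Arrr!"

def to_pirate_record(record):
    # Only the response is rewritten; the instruction stays untouched.
    return {**record, "response": piratify(record["response"])}

record = {"instruction": "Greet the user.", "response": "hello my friend"}
pirate_record = to_pirate_record(record)  # response -> "ahoy me friend Arrr!"
```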
liuyanchen1015/MULTI_VALUE_mnli_it_is_non_referential
--- dataset_info: features: - name: premise dtype: string - name: hypothesis dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: score dtype: int64 splits: - name: dev_matched num_bytes: 60052 num_examples: 288 - name: dev_mismatched num_bytes: 76812 num_examples: 317 - name: test_matched num_bytes: 45727 num_examples: 222 - name: test_mismatched num_bytes: 68535 num_examples: 292 - name: train num_bytes: 2157021 num_examples: 10552 download_size: 1446926 dataset_size: 2408147 --- # Dataset Card for "MULTI_VALUE_mnli_it_is_non_referential" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
sudeshna84/samanantar-bn-hi
--- task_categories: - translation language: - hi - bn pretty_name: samanantar ---
reshinthadith/dfg_augmented_mbpp
--- dataset_info: features: - name: prompt dtype: string - name: output dtype: string - name: code dtype: string splits: - name: train num_bytes: 32138 num_examples: 95 download_size: 17897 dataset_size: 32138 --- # Dataset Card for "dfg_augmented_mbpp" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
chetahy0711/CS6301_sampledata
--- dataset_info: features: - name: image dtype: image - name: expression dtype: string - name: img_width dtype: int64 - name: img_height dtype: int64 - name: x dtype: float64 - name: y dtype: float64 - name: w dtype: float64 - name: h dtype: float64 splits: - name: train num_bytes: 3093853.0 num_examples: 20 download_size: 3094944 dataset_size: 3093853.0 --- # Dataset Card for "CS6301_sampledata" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
CyberHarem/kiyoshimo_kantaicollection
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of kiyoshimo/清霜/清霜 (Kantai Collection)

This is the dataset of kiyoshimo/清霜/清霜 (Kantai Collection), containing 500 images and their tags.

The core tags of this character are `long_hair, grey_hair, ahoge, twintails, low_twintails, hair_between_eyes, very_long_hair, grey_eyes, bow`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:-----|-------:|:-----|:---------|:-----|:------------|
| raw | 500 | 445.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiyoshimo_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 290.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiyoshimo_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1044 | 575.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiyoshimo_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 405.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiyoshimo_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1044 | 751.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiyoshimo_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/kiyoshimo_kantaicollection',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined here.
### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:---|:---|:---|:---|:---|:---|
| 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bowtie, dress, grey_pantyhose, long_sleeves, looking_at_viewer, school_uniform, solo, white_shirt, halterneck, smile, white_background, multicolored_hair, simple_background |
| 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bowtie, dress, looking_at_viewer, school_uniform, simple_background, solo, white_shirt, single_hair_bun, smile, white_background, halterneck, long_sleeves, upper_body, blush, one-hour_drawing_challenge, open_mouth, twitter_username |
| 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, bowtie, long_sleeves, school_uniform, solo, white_shirt, grey_pantyhose, looking_at_viewer, open_mouth, white_background, simple_background, :d, machinery, sleeveless_dress, turret, blush, teeth |
| 3 | 15 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, bowtie, grey_pantyhose, school_uniform, solo, white_shirt, halterneck, full_body, lace-up_boots, long_sleeves, single_hair_bun, smile, standing, looking_at_viewer, open_mouth, white_background, chibi, purple_dress, simple_background |
| 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | blush, looking_at_viewer, navel, nipples, nude, pussy, small_breasts, 1girl, solo, uncensored, cleft_of_venus, open_mouth, :d, blue_eyes, collarbone, hair_ribbon, loli, purple_eyes, standing |
| 5 | 7 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, alternate_costume, blue_hair, full_body, looking_at_viewer, multicolored_hair, solo, simple_background, smile, white_socks, twitter_username, white_background, bag, holding, long_sleeves, standing, white_shirt, black_footwear, dress, flower, mary_janes, open_mouth |
| 6 | 7 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, blush, looking_at_viewer, nipples, panty_pull, polka_dot_panties, small_breasts, solo, green_panties, navel, pussy, smile, pantyhose_pull, bow_panties, bra_lift, dakimakura_(medium), grey_pantyhose, on_back, open_shirt |
| 7 | 11 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, detached_collar, fake_animal_ears, playboy_bunny, rabbit_ears, wrist_cuffs, purple_leotard, solo, strapless_leotard, grey_pantyhose, small_breasts, white_background, rabbit_tail, simple_background, aqua_bowtie, covered_navel, fishnet_pantyhose, highleg_leotard, smile, thighband_pantyhose, adapted_costume, full_body, purple_footwear, cowboy_shot, dated, hair_bun, looking_at_viewer, rudder_footwear |
| 8 | 9 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | alternate_costume, kimono, looking_at_viewer, 1girl, floral_print, hair_flower, solo, wide_sleeves, obi, blue_hair, blush, happy_new_year, long_sleeves, multicolored_hair, open_mouth, twitter_username, :d, upper_body |
| 9 | 11 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, solo, looking_at_viewer, single_hair_bun, smile, simple_background, white_background, collarbone, covered_navel, highleg_swimsuit, small_breasts, alternate_costume, competition_swimsuit, flat_chest, standing, barefoot, cowboy_shot, open_mouth, ass_visible_through_thighs, full_body |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bowtie | dress | grey_pantyhose | long_sleeves | looking_at_viewer | school_uniform | solo | white_shirt | halterneck | smile | white_background | multicolored_hair | simple_background | single_hair_bun | upper_body | blush | one-hour_drawing_challenge | open_mouth | twitter_username | :d | machinery | sleeveless_dress | turret | teeth | full_body | lace-up_boots | standing | chibi | purple_dress | navel | nipples | nude | pussy | small_breasts | uncensored | cleft_of_venus | blue_eyes | collarbone | hair_ribbon | loli | purple_eyes | alternate_costume | blue_hair | white_socks | bag | holding | black_footwear | flower | mary_janes | panty_pull | polka_dot_panties | green_panties | pantyhose_pull | bow_panties | bra_lift | dakimakura_(medium) | on_back | open_shirt | detached_collar | fake_animal_ears | playboy_bunny | rabbit_ears | wrist_cuffs | purple_leotard | strapless_leotard | rabbit_tail | aqua_bowtie | covered_navel | fishnet_pantyhose | highleg_leotard | thighband_pantyhose | adapted_costume | purple_footwear | cowboy_shot | dated | hair_bun | rudder_footwear | kimono | floral_print | hair_flower | wide_sleeves | obi | happy_new_year | highleg_swimsuit | competition_swimsuit | flat_chest | barefoot | ass_visible_through_thighs |
|----:|----------:|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | X | X | X | X | X | X | | | X | | X | | | X | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 15 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | X | X | X | X | X | X | X | X | X | | X | X | | | | X | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | | | X | | X | | | | | | | | | X | | X | | X | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | | X | X | | X | X | | X | X | X | X | | | | | X | X | | | | | | X | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 7 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | | X | | X | | X | | | X | | | | | | X | | | | | | | | | | | | | | X | X | | X | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 11 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | | X | | X | | X | | | X | X | | X | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 8 | 9 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | | | X | X | | X | | | | | X | | | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | |
| 9 | 11 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | | | | X | | X | | | X | X | | X | X | | | | X | | | | | | | X | | X | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | X | X | X | X | X |
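
The Table Version above is a tag-presence matrix: an `X` marks that a tag belongs to a cluster. Converting one such row back into a tag list can be sketched as follows (the header and row here are shortened, hypothetical excerpts, not a full row from the table):

```python
# shortened, hypothetical excerpt of the tag columns and one cluster row
header = ['1girl', 'bowtie', 'dress', 'grey_pantyhose', 'long_sleeves']
row = ['X', 'X', '', 'X', '']

# keep the tags whose cell carries an 'X' mark
tags = [tag for tag, mark in zip(header, row) if mark == 'X']
print(tags)  # ['1girl', 'bowtie', 'grey_pantyhose']
```

Applied to a full row, this reproduces the comma-separated tag list shown in the Raw Text Version for the same cluster.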
arieg/fma_small_images
--- dataset_info: features: - name: image dtype: image - name: label dtype: class_label: names: '0': '000002' '1': '000005' '2': '000010' '3': '000140' '4': '000141' '5': 000148 '6': 000182 '7': 000190 '8': 000193 '9': 000194 '10': 000197 '11': '000200' '12': '000203' '13': '000204' '14': '000207' '15': '000210' '16': '000211' '17': '000212' '18': '000213' '19': '000255' '20': '000256' '21': 000368 '22': '000424' '23': 000459 '24': '000534' '25': '000540' '26': '000546' '27': '000574' '28': '000602' '29': '000615' '30': '000620' '31': '000621' '32': '000625' '33': '000666' '34': '000667' '35': '000676' '36': 000690 '37': 000694 '38': 000695 '39': '000704' '40': '000705' '41': '000706' '42': '000707' '43': 000708 '44': 000709 '45': '000714' '46': '000715' '47': '000716' '48': 000718 '49': '000777' '50': 000814 '51': 000821 '52': 000822 '53': 000825 '54': 000853 '55': 000890 '56': 000892 '57': 000897 '58': 000993 '59': 000995 '60': 000997 '61': 000998 '62': 001039 '63': '001040' '64': '001066' '65': 001069 '66': '001073' '67': '001075' '68': 001082 '69': 001083 '70': 001087 '71': '001102' '72': 001193 '73': 001195 '74': 001196 '75': 001197 '76': 001249 '77': 001259 '78': '001270' '79': '001276' '80': '001277' '81': 001278 '82': '001417' '83': '001427' '84': '001443' '85': 001482 '86': '001510' '87': '001544' '88': '001642' '89': '001644' '90': 001649 '91': '001661' '92': '001663' '93': '001666' '94': '001673' '95': 001680 '96': 001681 '97': 001682 '98': 001683 '99': 001684 '100': 001685 '101': 001686 '102': 001687 '103': 001688 '104': 001689 '105': '001701' '106': '001702' '107': '001703' '108': '001704' '109': '001706' '110': '001720' '111': '001732' '112': '001733' '113': '001735' '114': '001736' '115': 001883 '116': 001891 '117': 001893 '118': 001924 '119': 001925 '120': 001929 '121': 001930 '122': '002012' '123': 002096 '124': 002097 '125': 002099 '126': '003263' '127': '003264' '128': '003265' '129': '003266' '130': '003270' '131': '003271' '132': '003272' 
'133': '003273' '134': '003274' '135': 003492 '136': '003532' '137': '003533' '138': '003534' '139': '003535' '140': '003537' '141': 003538 '142': '003573' '143': 003598 '144': '003624' '145': '003707' '146': 003708 '147': '003720' '148': '003721' '149': '003722' '150': '003724' '151': '003725' '152': '003761' '153': '003762' '154': '003763' '155': '003765' '156': '003766' '157': '003775' '158': '003776' '159': '003777' '160': 003778 '161': 003779 '162': 003832 '163': 003833 '164': 003840 '165': 003880 '166': 003895 '167': 003896 '168': 003904 '169': 003905 '170': 003906 '171': 003908 '172': 003909 '173': 003910 '174': 003911 '175': 003912 '176': 003913 '177': 003920 '178': 003921 '179': 003950 '180': '004013' '181': '004017' '182': '004022' '183': '004037' '184': '004066' '185': '004067' '186': 004068 '187': 004069 '188': '004070' '189': '004071' '190': '004072' '191': '004073' '192': '004074' '193': '004075' '194': '004076' '195': '004077' '196': 004078 '197': 004079 '198': 004080 '199': 004091 '200': 004092 '201': 004093 '202': 004094 '203': 004095 '204': 004096 '205': 004097 '206': 004098 '207': 004099 '208': '004100' '209': '004101' '210': '004102' '211': '004103' '212': 004108 '213': '004232' '214': '004233' '215': '004234' '216': '004235' '217': '004236' '218': 004239 '219': '004450' '220': '004507' '221': 004508 '222': 004509 '223': '004510' '224': '004511' '225': 004519 '226': '004520' '227': '004521' '228': '004522' '229': 004682 '230': 004684 '231': 004685 '232': 004688 '233': '004777' '234': 004778 '235': 004779 '236': 004780 '237': 004781 '238': 004782 '239': 004784 '240': 004785 '241': 004786 '242': 004787 '243': 004788 '244': 004799 '245': 004835 '246': 004836 '247': 004838 '248': 004846 '249': 004848 '250': 004849 '251': '005006' '252': '005156' '253': '005157' '254': 005158 '255': 005159 '256': 005169 '257': '005170' '258': '005171' '259': 005191 '260': '005264' '261': 005268 '262': '005376' '263': 005381 '264': '005521' '265': 005879 '266': 005936 
'267': 005940 '268': 006329 '269': '006330' '270': '006331' '271': '006332' '272': '006333' '273': '006342' '274': '006354' '275': '006357' '276': 006358 '277': '006360' '278': '006363' '279': '006366' '280': '006367' '281': 006368 '282': '006370' '283': '006372' '284': '006373' '285': '006376' '286': 006379 '287': 006380 '288': 006381 '289': 006382 '290': 006383 '291': 006385 '292': 006387 '293': 006389 '294': 006390 '295': 006393 '296': 006394 '297': 006396 '298': '006406' '299': '006407' '300': 006439 '301': '006440' '302': '006442' '303': '006443' '304': 006448 '305': 006459 '306': '006461' '307': '006463' '308': '006467' '309': 006469 '310': '006517' '311': 006519 '312': '006603' '313': '006605' '314': '006606' '315': '006607' '316': 006608 '317': 006609 '318': '006610' '319': '006611' '320': '006674' '321': '006675' '322': '006677' '323': 006679 '324': 006680 '325': 006684 '326': '006762' '327': '006776' '328': 006778 '329': 006779 '330': 006782 '331': 006783 '332': 006788 '333': 006802 '334': 006803 '335': 006854 '336': 006855 '337': 006856 '338': 006857 '339': '007011' '340': '007373' '341': '007374' '342': '007375' '343': '007376' '344': '007377' '345': 007378 '346': 007379 '347': 007381 '348': 007383 '349': 007385 '350': 007386 '351': 007388 '352': 007391 '353': 007393 '354': 007481 '355': 007482 '356': 007483 '357': 007487 '358': 007488 '359': 007489 '360': 007490 '361': 007491 '362': 007492 '363': 007495 '364': '007526' '365': '007527' '366': 007528 '367': 007529 '368': 007548 '369': '007554' '370': 007709 '371': '007710' '372': '007711' '373': '007712' '374': '007713' '375': 007872 '376': 008056 '377': 008208 '378': 008256 '379': 008259 '380': 008261 '381': 008345 '382': 008357 '383': 008363 '384': 008372 '385': 008416 '386': 009152 '387': 009155 '388': 009307 '389': 009476 '390': 009477 '391': 009491 '392': 009505 '393': 009511 '394': 009512 '395': 009513 '396': 009550 '397': 009553 '398': 009555 '399': 009557 '400': 009559 '401': 009560 '402': 009678 
'403': 009721 '404': 009846 '405': 009887 '406': 009888 '407': 009918 '408': 009962 '409': 010186 '410': 010192 '411': '010250' '412': '010374' '413': '010375' '414': '010376' '415': '010377' '416': 010381 '417': 010382 '418': 010383 '419': 010384 '420': 010385 '421': 010386 '422': 010387 '423': 010388 '424': 010389 '425': '010435' '426': 010438 '427': 010439 '428': '010440' '429': '010441' '430': '010442' '431': '010443' '432': '010444' '433': '010447' '434': 010458 '435': 010480 '436': 010481 '437': 010485 '438': '010521' '439': '010527' '440': '010535' '441': '010541' '442': '010575' '443': '010577' '444': 010668 '445': 010669 '446': '010670' '447': '010671' '448': '010672' '449': '010673' '450': '010674' '451': '010675' '452': '010676' '453': '010677' '454': 010678 '455': 010679 '456': 010682 '457': 010684 '458': 010693 '459': 010694 '460': 010695 '461': 010696 '462': 010697 '463': 010698 '464': 010699 '465': 010805 '466': 010806 '467': 010807 '468': 010808 '469': 010809 '470': 010810 '471': 010983 '472': 010992 '473': 010993 '474': 011019 '475': '011020' '476': 011059 '477': 011198 '478': 011199 '479': '011200' '480': '011204' '481': '011206' '482': '011234' '483': '011237' '484': 011239 '485': '011242' '486': '011261' '487': '011262' '488': '011264' '489': 011268 '490': 011298 '491': 011299 '492': '011306' '493': '011333' '494': '011334' '495': '011503' '496': '011504' '497': '011505' '498': 011508 '499': '011544' '500': 011638 '501': '011671' '502': '011672' '503': '011673' '504': '011674' '505': '011675' '506': '011677' '507': 011679 '508': 011681 '509': 011682 '510': 011683 '511': '011763' '512': '011764' '513': '011765' '514': '011766' '515': '011767' '516': 011768 '517': 011769 '518': '011770' '519': '011771' '520': '011772' '521': '011773' '522': '011774' '523': '011775' '524': '011776' '525': '011777' '526': 011778 '527': 011779 '528': 011780 '529': 011781 '530': 011782 '531': 011783 '532': 011784 '533': 011785 '534': 011786 '535': 011787 '536': 011788 
'537': 011789 '538': 011790 '539': 011791 '540': 011792 '541': 011793 '542': 011794 '543': 011795 '544': 011803 '545': 011818 '546': 011839 '547': 011861 '548': 011862 '549': 011867 '550': 011868 '551': 011916 '552': 011917 '553': 011918 '554': 011919 '555': 011920 '556': 011921 '557': 011922 '558': 011933 '559': 011937 '560': 011942 '561': 011946 '562': 011947 '563': 011951 '564': '012045' '565': '012046' '566': '012047' '567': 012048 '568': 012049 '569': '012050' '570': '012051' '571': '012052' '572': '012053' '573': 012058 '574': 012059 '575': '012060' '576': '012061' '577': '012062' '578': '012064' '579': '012065' '580': '012066' '581': '012067' '582': 012109 '583': '012146' '584': '012147' '585': '012173' '586': '012174' '587': 012179 '588': 012188 '589': 012189 '590': '012346' '591': 012348 '592': 012349 '593': '012350' '594': '012351' '595': '012352' '596': '012353' '597': '012355' '598': '012376' '599': 012387 '600': 012390 '601': 012394 '602': 012481 '603': 012482 '604': 012484 '605': 012485 '606': 012486 '607': 012487 '608': 012488 '609': 012489 '610': 012490 '611': 012508 '612': '012513' '613': '012514' '614': 012518 '615': '012521' '616': '012526' '617': '012527' '618': '012530' '619': '012531' '620': '012532' '621': '012537' '622': '012551' '623': '012552' '624': '012654' '625': 012690 '626': 012691 '627': 012692 '628': '012737' '629': 012985 '630': 012986 '631': 013191 '632': 013197 '633': 013199 '634': '013201' '635': 013218 '636': '013220' '637': '013325' '638': 013328 '639': '013362' '640': 013378 '641': '013474' '642': '013537' '643': 013538 '644': 013539 '645': '013540' '646': '013556' '647': '013561' '648': '013562' '649': '013566' '650': '013571' '651': 013578 '652': 013591 '653': 013596 '654': '013666' '655': 013668 '656': '013670' '657': '013706' '658': '013707' '659': 013708 '660': 013709 '661': '013710' '662': '013711' '663': '013735' '664': '013747' '665': 013748 '666': 013749 '667': '013767' '668': 013768 '669': 013804 '670': 013927 
'671': 013928 '672': 013929 '673': 013930 '674': '014063' '675': 014208 '676': '014315' '677': '014316' '678': '014317' '679': 014318 '680': 014319 '681': '014320' '682': '014344' '683': 014358 '684': '014363' '685': '014365' '686': 014386 '687': 014391 '688': 014538 '689': 014539 '690': '014541' '691': '014542' '692': 014568 '693': 014569 '694': '014570' '695': '014571' '696': '014572' '697': '014576' '698': '014577' '699': 014578 '700': 014579 '701': 014580 '702': 014581 '703': 014583 '704': 014584 '705': 014585 '706': 014586 '707': 014588 '708': 014589 '709': 014590 '710': '014601' '711': '014602' '712': '014603' '713': '014604' '714': '014653' '715': '014661' '716': '014663' '717': 014684 '718': 014690 '719': 014693 '720': '014733' '721': '014734' '722': '014735' '723': '014736' '724': '014737' '725': 014738 '726': 014739 '727': '014740' '728': '014741' '729': '014742' '730': '014743' '731': '014744' '732': '014745' '733': 014809 '734': 014869 '735': 015094 '736': '015210' '737': '015464' '738': 015469 '739': '015471' '740': '015475' '741': '015476' '742': 015487 '743': 015488 '744': '015540' '745': '015541' '746': '015542' '747': '015543' '748': '015625' '749': 015769 '750': '015770' '751': '015771' '752': '015772' '753': '015773' '754': 015880 '755': 016095 '756': '016155' '757': 016158 '758': '016162' '759': '016163' '760': '016334' '761': '016337' '762': 016338 '763': 016339 '764': '016340' '765': '016354' '766': '016743' '767': '016744' '768': '016745' '769': '016747' '770': 016819 '771': 016820 '772': 016821 '773': 016822 '774': 016878 '775': 016879 '776': 016880 '777': 016895 '778': 016994 '779': 016995 '780': 016997 '781': '017132' '782': '017344' '783': '017345' '784': '017462' '785': 017491 '786': 017496 '787': 017499 '788': '017500' '789': '017573' '790': 017588 '791': '017605' '792': '017606' '793': '017607' '794': 017608 '795': 017609 '796': '017610' '797': '017611' '798': '017631' '799': '017632' '800': '017633' '801': '017634' '802': '017635' 
'803': '017636' '804': '017637' '805': '017644' '806': '017735' '807': 017782 '808': 017884 '809': 017906 '810': 018031 '811': 018032 '812': 018033 '813': 018034 '814': 018037 '815': 018038 '816': 018039 '817': 018043 '818': 018044 '819': 018112 '820': 018124 '821': 018144 '822': 018145 '823': 018146 '824': 018159 '825': 018197 '826': 018350 '827': 018607 '828': 018611 '829': 018876 '830': 018877 '831': 018887 '832': 019073 '833': 019074 '834': 019179 '835': 019184 '836': 019187 '837': 019192 '838': 019412 '839': 019413 '840': 019415 '841': 019416 '842': 019417 '843': 019418 '844': 019420 '845': 019422 '846': 019423 '847': 019425 '848': 019438 '849': 019439 '850': 019441 '851': 019442 '852': 019459 '853': 019673 '854': 019674 '855': 019685 '856': 019689 '857': 019707 '858': 019708 '859': 019729 '860': 019758 '861': 019759 '862': 019760 '863': 019889 '864': 019890 '865': 019891 '866': '020050' '867': 020296 '868': '020361' '869': '020362' '870': '020364' '871': '020365' '872': '020366' '873': 020369 '874': '020372' '875': '020373' '876': '020374' '877': '020375' '878': '020376' '879': '020424' '880': '020432' '881': 020469 '882': '020667' '883': '020704' '884': 020818 '885': 021058 '886': 021085 '887': 021087 '888': '021167' '889': 021228 '890': '021231' '891': '021232' '892': '021400' '893': '021401' '894': '021402' '895': '021403' '896': '021404' '897': 021409 '898': '021422' '899': '021565' '900': 021587 '901': '021657' '902': '021672' '903': '021676' '904': '021677' '905': '021707' '906': '021774' '907': 021842 '908': 021859 '909': 021860 '910': 021891 '911': 021895 '912': 021995 '913': 021996 '914': 021997 '915': 021998 '916': 021999 '917': '022000' '918': '022001' '919': 022088 '920': 022091 '921': 022093 '922': 022094 '923': 022095 '924': 022097 '925': '022150' '926': 022295 '927': 022296 '928': '022315' '929': 022348 '930': '022472' '931': '022473' '932': '022474' '933': '022475' '934': '022476' '935': '022477' '936': 022478 '937': 022479 '938': 022480 
'939': 022481 '940': '023010' '941': '023013' '942': '023014' '943': '023015' '944': '023016' '945': '023037' '946': 023039 '947': '023041' '948': '023063' '949': '023155' '950': '023156' '951': '023172' '952': 023329 '953': '023353' '954': '023355' '955': '023371' '956': '023372' '957': '023505' '958': 023862 '959': '024216' '960': '024217' '961': 024218 '962': '024362' '963': '024363' '964': '024364' '965': '024365' '966': '024366' '967': '024367' '968': 024368 '969': 024369 '970': '024370' '971': '024371' '972': 024418 '973': '024420' '974': '024421' '975': '024422' '976': '024423' '977': '024424' '978': '024425' '979': '024426' '980': '024427' '981': 024428 '982': 024429 '983': '024430' '984': '024431' '985': '024432' '986': '024512' '987': '024515' '988': '024521' '989': '024524' '990': 024698 '991': 024699 '992': '024700' '993': '024701' '994': '024702' '995': '024717' '996': '024720' '997': 024739 '998': '024741' '999': '024742' '1000': '024745' '1001': '024746' '1002': '024747' '1003': 024748 '1004': 024749 '1005': 024842 '1006': 024898 '1007': 024899 '1008': 024901 '1009': 024912 '1010': 024915 '1011': 024917 '1012': 024963 '1013': 024975 '1014': 024983 '1015': 025028 '1016': 025029 '1017': '025030' '1018': '025031' '1019': '025032' '1020': '025033' '1021': '025055' '1022': '025063' '1023': '025066' '1024': '025104' '1025': '025124' '1026': '025215' '1027': '025216' '1028': '025227' '1029': '025232' '1030': '025233' '1031': '025234' '1032': '025235' '1033': '025324' '1034': 025378 '1035': '025601' '1036': '025603' '1037': '025605' '1038': '025606' '1039': 025608 '1040': 025609 '1041': 025668 '1042': 025669 '1043': '025670' '1044': 025795 '1045': 025796 '1046': 025797 '1047': 025802 '1048': 025804 '1049': '026007' '1050': 026008 '1051': '026010' '1052': '026011' '1053': '026012' '1054': '026013' '1055': '026014' '1056': '026016' '1057': '026017' '1058': '026020' '1059': '026021' '1060': '026022' '1061': '026025' '1062': '026026' '1063': '026034' '1064': 
'026035' '1065': '026036' '1066': 026169 '1067': '026174' '1068': 026298 '1069': '026301' '1070': '026302' '1071': '026307' '1072': '026322' '1073': '026464' '1074': '026465' '1075': '026466' '1076': 026583 '1077': '026600' '1078': '026605' '1079': 026629 '1080': 026638 '1081': 026639 '1082': '026640' '1083': '026641' '1084': '026642' '1085': '026643' '1086': '026651' '1087': '026652' '1088': '026653' '1089': '026654' '1090': '026655' '1091': '026656' '1092': '026657' '1093': 026658 '1094': 026659 '1095': '026674' '1096': 026681 '1097': '026754' '1098': '026765' '1099': 026859 '1100': 026861 '1101': 026902 '1102': 026904 '1103': 026905 '1104': 026906 '1105': '027164' '1106': '027177' '1107': 027194 '1108': 027195 '1109': 027197 '1110': 027198 '1111': 027258 '1112': '027406' '1113': '027454' '1114': '027455' '1115': '027456' '1116': '027547' '1117': 027548 '1118': 027549 '1119': '027550' '1120': '027551' '1121': '027552' '1122': 027609 '1123': '027610' '1124': '027611' '1125': '027612' '1126': '027613' '1127': '027667' '1128': '027673' '1129': 027797 '1130': 027798 '1131': 027799 '1132': 027802 '1133': 027803 '1134': 027804 '1135': 027805 '1136': 027855 '1137': 027856 '1138': 027866 '1139': 027945 '1140': 027953 '1141': 027975 '1142': 027978 '1143': 027981 '1144': 027987 '1145': 028070 '1146': 028072 '1147': 028179 '1148': 028241 '1149': 028260 '1150': 028266 '1151': 028274 '1152': 028375 '1153': 028376 '1154': 028477 '1155': 028478 '1156': 028479 '1157': 028480 '1158': 028481 '1159': 028482 '1160': 028483 '1161': 028484 '1162': 028485 '1163': 028546 '1164': 028548 '1165': 028553 '1166': 028571 '1167': 028608 '1168': 028692 '1169': 028802 '1170': 029037 '1171': 029039 '1172': 029040 '1173': 029041 '1174': 029042 '1175': 029043 '1176': 029044 '1177': 029045 '1178': 029128 '1179': 029180 '1180': 029243 '1181': 029245 '1182': 029255 '1183': 029271 '1184': 029272 '1185': 029350 '1186': 029351 '1187': 029355 '1188': 029465 '1189': 029480 '1190': 029526 '1191': 029528 
'1192': 029530 '1193': 029587 '1194': 029602 '1195': 029673 '1196': 029718 '1197': 029719 '1198': 029720 '1199': 029721 '1200': 029738 '1201': 029739 '1202': 029740 '1203': 029741 '1204': 029742 '1205': 029744 '1206': 029745 '1207': 029746 '1208': 029747 '1209': 029750 '1210': 029752 '1211': 029807 '1212': 029813 '1213': 029816 '1214': 029961 '1215': 029971 '1216': '030041' '1217': '030043' '1218': '030050' '1219': '030056' '1220': 030058 '1221': 030059 '1222': 030090 '1223': 030095 '1224': '030120' '1225': 030196 '1226': 030198 '1227': '030230' '1228': '030316' '1229': 030486 '1230': 030487 '1231': 030488 '1232': 030519 '1233': '030520' '1234': '030521' '1235': '030522' '1236': '030636' '1237': 030682 '1238': 030690 '1239': '030702' '1240': '030740' '1241': 030895 '1242': '031040' '1243': '031041' '1244': '031042' '1245': '031043' '1246': '031044' '1247': '031165' '1248': '031356' '1249': 031389 '1250': 031390 '1251': 031391 '1252': 031392 '1253': 031568 '1254': 031807 '1255': 031887 '1256': 031888 '1257': 031889 '1258': 031999 '1259': '032001' '1260': '032021' '1261': '032075' '1262': 032081 '1263': 032218 '1264': '032325' '1265': '032326' '1266': '032327' '1267': 032328 '1268': 032329 '1269': '032330' '1270': '032331' '1271': '032332' '1272': '032333' '1273': '032334' '1274': '032335' '1275': '032336' '1276': '032337' '1277': 032338 '1278': 032339 '1279': '032340' '1280': '032433' '1281': '032435' '1282': '032437' '1283': 032438 '1284': 032439 '1285': '032525' '1286': 032686 '1287': 032687 '1288': 032689 '1289': 032693 '1290': 032694 '1291': 032695 '1292': '032755' '1293': '032756' '1294': 032759 '1295': '032760' '1296': 032800 '1297': 032882 '1298': '033020' '1299': 033049 '1300': '033050' '1301': '033064' '1302': '033067' '1303': 033068 '1304': 033069 '1305': '033070' '1306': '033071' '1307': '033072' '1308': '033123' '1309': '033124' '1310': '033203' '1311': '033216' '1312': '033221' '1313': 033278 '1314': '033415' '1315': '033422' '1316': '033424' '1317': 
'033426' '1318': '033446' '1319': 033459 '1320': '033460' '1321': '033461' '1322': '033465' '1323': '033477' '1324': 033486 '1325': 033538 '1326': 033992 '1327': '034003' '1328': '034147' '1329': '034167' '1330': '034257' '1331': 034258 '1332': '034263' '1333': 034484 '1334': '034510' '1335': '034511' '1336': 034994 '1337': 034996 '1338': '035007' '1339': 035008 '1340': 035182 '1341': 035184 '1342': 035198 '1343': 035199 '1344': '035204' '1345': 035296 '1346': 035299 '1347': '035443' '1348': '035444' '1349': '035462' '1350': '035527' '1351': '035534' '1352': '035535' '1353': '035537' '1354': 035539 '1355': '035541' '1356': '035543' '1357': '035544' '1358': '035545' '1359': 035549 '1360': '035550' '1361': 035569 '1362': '035571' '1363': 035608 '1364': '035734' '1365': 036096 '1366': 036097 '1367': 036099 '1368': '036143' '1369': '036144' '1370': '036145' '1371': '036146' '1372': '036147' '1373': '036245' '1374': '036257' '1375': 036258 '1376': '036261' '1377': '036272' '1378': '036273' '1379': '036275' '1380': '036277' '1381': '036302' '1382': '036304' '1383': '036322' '1384': '036333' '1385': '036371' '1386': 036380 '1387': 036388 '1388': 036428 '1389': '036435' '1390': 036481 '1391': '036526' '1392': '036560' '1393': '036567' '1394': '036614' '1395': '036615' '1396': '036616' '1397': 036618 '1398': '036643' '1399': 036659 '1400': 036799 '1401': 036959 '1402': 036961 '1403': 036965 '1404': 036966 '1405': 036983 '1406': 036984 '1407': 036985 '1408': 036986 '1409': 036987 '1410': 036988 '1411': 036990 '1412': 036992 '1413': 036994 '1414': 036997 '1415': 036998 '1416': 036999 '1417': '037041' '1418': '037111' '1419': '037113' '1420': 037119 '1421': '037121' '1422': '037131' '1423': '037136' '1424': '037141' '1425': '037147' '1426': '037324' '1427': '037325' '1428': 037368 '1429': 037369 '1430': '037416' '1431': '037417' '1432': '037423' '1433': 037538 '1434': 037592 '1435': '037725' '1436': '037727' '1437': '037730' '1438': '037731' '1439': 037779 '1440': 037781 
'1441': 037784 '1442': 037859 '1443': 037911 '1444': 037920 '1445': 038312 '1446': 038321 '1447': 038323 '1448': 038326 '1449': 038351 '1450': 038352 '1451': 038353 '1452': 038354 '1453': 038361 '1454': 038362 '1455': 038363 '1456': 038365 '1457': 038399 '1458': 038435 '1459': 038450 '1460': 038522 '1461': 038557 '1462': 038560 '1463': 038775 '1464': 038776 '1465': 038777 '1466': 038778 '1467': 038779 '1468': 038780 '1469': 038781 '1470': 038782 '1471': 038783 '1472': 038784 '1473': 038785 '1474': 038817 '1475': 038818 '1476': 038819 '1477': 038820 '1478': 038821 '1479': 038822 '1480': 038823 '1481': 038824 '1482': 038825 '1483': 038826 '1484': 038827 '1485': 038828 '1486': 038829 '1487': 038830 '1488': 038833 '1489': 038834 '1490': 038847 '1491': 038859 '1492': 038878 '1493': 038879 '1494': 038880 '1495': 038881 '1496': 038882 '1497': 038884 '1498': 038886 '1499': 038887 '1500': 038888 '1501': 038890 '1502': 038891 '1503': 038892 '1504': 038893 '1505': 038894 '1506': 038895 '1507': 038896 '1508': 038898 '1509': 038899 '1510': 038900 '1511': 038901 '1512': 038902 '1513': 038904 '1514': 038905 '1515': 038906 '1516': 038907 '1517': 038908 '1518': 038910 '1519': 038911 '1520': 038912 '1521': 038914 '1522': 038955 '1523': 038961 '1524': 038964 '1525': 038965 '1526': 038966 '1527': 038967 '1528': 039188 '1529': 039259 '1530': 039278 '1531': 039291 '1532': 039298 '1533': 039316 '1534': 039317 '1535': 039318 '1536': 039357 '1537': 039359 '1538': 039378 '1539': 039484 '1540': 039488 '1541': 039530 '1542': 039605 '1543': 039607 '1544': 039658 '1545': 039659 '1546': 039660 '1547': 039661 '1548': 039662 '1549': 039663 '1550': 039664 '1551': 039665 '1552': 039666 '1553': 039667 '1554': 039875 '1555': 039900 '1556': 039904 '1557': '040121' '1558': '040122' '1559': '040123' '1560': '040133' '1561': '040134' '1562': 040139 '1563': '040141' '1564': '040147' '1565': '040161' '1566': 040180 '1567': 040182 '1568': 040229 '1569': '040230' '1570': '040231' '1571': '040232' '1572': 
'040233' '1573': '040234' '1574': '040235' '1575': '040236' '1576': '040237' '1577': 040238 '1578': 040239 '1579': '040240' '1580': '040241' '1581': '040242' '1582': '040243' '1583': '040244' '1584': '040245' '1585': '040250' '1586': 040509 '1587': '040525' '1588': '040541' '1589': '040542' '1590': 040598 '1591': '040654' '1592': '040655' '1593': '040656' '1594': '040657' '1595': 040658 '1596': 040659 '1597': '040660' '1598': 040683 '1599': '040725' '1600': 040842 '1601': 040843 '1602': 040844 '1603': 040845 '1604': 040851 '1605': 040903 '1606': 040908 '1607': 040909 '1608': 040938 '1609': 040940 '1610': 040984 '1611': 040985 '1612': 040986 '1613': 041018 '1614': 041019 '1615': '041020' '1616': '041054' '1617': 041095 '1618': '041147' '1619': 041191 '1620': 041192 '1621': '041310' '1622': 041381 '1623': 041568 '1624': '041570' '1625': '041573' '1626': '041605' '1627': 041709 '1628': '041714' '1629': 041812 '1630': 041819 '1631': 041820 '1632': 041825 '1633': 041961 '1634': 041962 '1635': 041965 '1636': 041971 '1637': 041983 '1638': '042014' '1639': '042016' '1640': '042017' '1641': 042018 '1642': 042019 '1643': '042020' '1644': '042023' '1645': '042025' '1646': 042029 '1647': '042030' '1648': '042031' '1649': '042040' '1650': '042044' '1651': '042045' '1652': '042046' '1653': 042048 '1654': 042119 '1655': '042126' '1656': 042129 '1657': '042135' '1658': 042138 '1659': 042139 '1660': '042141' '1661': '042146' '1662': '042234' '1663': '042235' '1664': '042236' '1665': 042238 '1666': '042240' '1667': '042241' '1668': '042243' '1669': '042245' '1670': '042247' '1671': '042310' '1672': '042372' '1673': '042373' '1674': '042374' '1675': '042375' '1676': '042376' '1677': '042377' '1678': '042442' '1679': '042463' '1680': '042475' '1681': 042648 '1682': 042659 '1683': '042751' '1684': '042761' '1685': 042789 '1686': 042844 '1687': 042851 '1688': 042911 '1689': 042914 '1690': 042915 '1691': 042966 '1692': 042984 '1693': '043016' '1694': 043018 '1695': 043019 '1696': 
'043020' '1697': '043021' '1698': '043022' '1699': '043023' '1700': '043024' '1701': '043025' '1702': '043026' '1703': '043027' '1704': 043028 '1705': 043029 '1706': '043030' '1707': '043063' '1708': '043172' '1709': '043173' '1710': '043516' '1711': '043517' '1712': 043518 '1713': 043519 '1714': '043520' '1715': '043521' '1716': '043533' '1717': '043534' '1718': '043535' '1719': '043536' '1720': 043585 '1721': 043586 '1722': 043587 '1723': 043588 '1724': 043589 '1725': 043590 '1726': 043592 '1727': 043593 '1728': 043594 '1729': 043595 '1730': 043596 '1731': 043598 '1732': 043599 '1733': '043600' '1734': 043608 '1735': '043621' '1736': '043623' '1737': 043691 '1738': 043695 '1739': 043696 '1740': 043697 '1741': 043698 '1742': 043699 '1743': '043761' '1744': '043765' '1745': '043766' '1746': '043767' '1747': 043768 '1748': '043773' '1749': 043796 '1750': 043842 '1751': 043843 '1752': 043844 '1753': 043856 '1754': 043857 '1755': 043858 '1756': 043859 '1757': 043860 '1758': 043861 '1759': 043863 '1760': 043865 '1761': 043866 '1762': 043867 '1763': 043868 '1764': 043869 '1765': 043883 '1766': 043886 '1767': 043899 '1768': 043911 '1769': 043962 '1770': 043965 '1771': 044092 '1772': '044110' '1773': 044169 '1774': '044236' '1775': '044342' '1776': '044347' '1777': '044354' '1778': '044355' '1779': '044777' '1780': 044778 '1781': 044779 '1782': 044780 '1783': 044781 '1784': 044782 '1785': 044791 '1786': 044792 '1787': 044793 '1788': 044794 '1789': 044795 '1790': 044796 '1791': 044797 '1792': 044798 '1793': 044799 '1794': 044800 '1795': 044801 '1796': 044802 '1797': 044803 '1798': 044804 '1799': 044805 '1800': 044806 '1801': 044809 '1802': 044820 '1803': 044821 '1804': 044822 '1805': 044823 '1806': 044848 '1807': 044849 '1808': 044850 '1809': 044851 '1810': 044853 '1811': 044854 '1812': 044917 '1813': 044918 '1814': 044946 '1815': 044947 '1816': 044948 '1817': 044949 '1818': 044950 '1819': 044951 '1820': 044952 '1821': '045055' '1822': 045099 '1823': '045100' '1824': 
'045101' '1825': '045102' '1826': '045103' '1827': 045119 '1828': '045122' '1829': '045125' '1830': '045126' '1831': '045127' '1832': 045128 '1833': 045149 '1834': '045150' '1835': '045151' '1836': '045152' '1837': '045153' '1838': '045154' '1839': '045335' '1840': 045387 '1841': 045388 '1842': 045389 '1843': 045390 '1844': 045391 '1845': 045392 '1846': 045393 '1847': '045474' '1848': '045475' '1849': 045508 '1850': '045513' '1851': '045514' '1852': '045515' '1853': '045516' '1854': '045517' '1855': 045518 '1856': 045519 '1857': '045520' '1858': '045521' '1859': '045522' '1860': '045523' '1861': 045934 '1862': 045941 '1863': '046024' '1864': '046043' '1865': 046058 '1866': 046068 '1867': 046078 '1868': 046079 '1869': '046157' '1870': 046158 '1871': 046159 '1872': '046160' '1873': '046161' '1874': '046162' '1875': 046238 '1876': '046241' '1877': '046525' '1878': '046611' '1879': '046711' '1880': '046717' '1881': 046718 '1882': '046720' '1883': '046726' '1884': '046732' '1885': '046733' '1886': '046736' '1887': 046839 '1888': 046840 '1889': 046841 '1890': 046842 '1891': 046844 '1892': 046846 '1893': 046854 '1894': 046855 '1895': 046928 '1896': 046930 '1897': '047032' '1898': 047068 '1899': 047069 '1900': '047070' '1901': '047071' '1902': '047072' '1903': '047073' '1904': '047074' '1905': '047075' '1906': '047076' '1907': '047077' '1908': '047100' '1909': 047192 '1910': 047193 '1911': 047194 '1912': 047195 '1913': 047196 '1914': 047197 '1915': 047198 '1916': 047199 '1917': '047200' '1918': '047201' '1919': '047202' '1920': '047260' '1921': '047471' '1922': '047506' '1923': '047510' '1924': '047526' '1925': 047628 '1926': '047657' '1927': 047658 '1928': 047659 '1929': '047660' '1930': '047661' '1931': '047662' '1932': '047663' '1933': '047665' '1934': '047666' '1935': '047670' '1936': '047671' '1937': '047707' '1938': 047826 '1939': 047835 '1940': 047865 '1941': 047868 '1942': 047894 '1943': 047895 '1944': 047896 '1945': 047897 '1946': 047916 '1947': 047921 '1948': 
048015 '1949': 048042 '1950': 048043 '1951': 048044 '1952': 048046 '1953': 048269 '1954': 048293 '1955': 048307 '1956': 048317 '1957': 048367 '1958': 048368 '1959': 048369 '1960': 048437 '1961': 048439 '1962': 048440 '1963': 048442 '1964': 048443 '1965': 048444 '1966': 048446 '1967': 048450 '1968': 048452 '1969': 048453 '1970': 048454 '1971': 048456 '1972': 048457 '1973': 048462 '1974': 048463 '1975': 048464 '1976': 048465 '1977': 048466 '1978': 048488 '1979': 048489 '1980': 048491 '1981': 048492 '1982': 048493 '1983': 048494 '1984': 048763 '1985': 048808 '1986': 048815 '1987': 048861 '1988': 048862 '1989': 048863 '1990': 048864 '1991': 048865 '1992': 048931 '1993': 048990 '1994': 048999 '1995': 049029 '1996': 049030 '1997': 049039 '1998': 049061 '1999': 049062 '2000': 049064 '2001': 049066 '2002': 049067 '2003': 049068 '2004': 049070 '2005': 049071 '2006': 049072 '2007': 049073 '2008': 049394 '2009': 049401 '2010': 049407 '2011': 049408 '2012': 049441 '2013': 049473 '2014': 049476 '2015': 049477 '2016': 049478 '2017': 049479 '2018': 049812 '2019': 049817 '2020': 049842 '2021': 049843 '2022': 049844 '2023': 049845 '2024': 049846 '2025': 049847 '2026': 049848 '2027': 049849 '2028': 049856 '2029': 049857 '2030': '050264' '2031': '050272' '2032': '050276' '2033': 050283 '2034': '050323' '2035': '050444' '2036': '050445' '2037': '050446' '2038': '050447' '2039': 050448 '2040': 050449 '2041': 050539 '2042': '050543' '2043': '050752' '2044': '050753' '2045': '050754' '2046': 050836 '2047': 050952 '2048': 050955 '2049': 050956 '2050': '051004' '2051': '051005' '2052': '051006' '2053': '051111' '2054': '051112' '2055': '051113' '2056': '051114' '2057': '051115' '2058': '051117' '2059': 051118 '2060': '051120' '2061': '051157' '2062': 051158 '2063': '051203' '2064': '051260' '2065': '051261' '2066': '051262' '2067': '051263' '2068': '051265' '2069': '051267' '2070': 051268 '2071': 051269 '2072': '051271' '2073': '051272' '2074': '051273' '2075': '051274' '2076': '051275' 
'2077': '051276' '2078': 051278 '2079': 051291 '2080': 051292 '2081': '051301' '2082': '051305' '2083': '051333' '2084': 051479 '2085': '051655' '2086': 051659 '2087': '051661' '2088': '051776' '2089': 051784 '2090': 051785 '2091': 051918 '2092': 051919 '2093': 051923 '2094': 051954 '2095': 051991 '2096': 051992 '2097': 051998 '2098': 051999 '2099': '052000' '2100': '052001' '2101': '052034' '2102': '052035' '2103': '052036' '2104': '052037' '2105': 052039 '2106': '052040' '2107': '052041' '2108': '052042' '2109': '052044' '2110': '052045' '2111': 052118 '2112': 052119 '2113': '052120' '2114': '052121' '2115': '052122' '2116': '052123' '2117': '052124' '2118': '052125' '2119': '052126' '2120': '052127' '2121': 052128 '2122': 052129 '2123': '052141' '2124': '052375' '2125': 052380 '2126': 052389 '2127': 052393 '2128': 052409 '2129': '052446' '2130': '052447' '2131': 052448 '2132': 052449 '2133': '052451' '2134': '052452' '2135': '052500' '2136': '052501' '2137': '052502' '2138': 052508 '2139': '052522' '2140': 052579 '2141': 052628 '2142': 052629 '2143': '052630' '2144': '052631' '2145': '052632' '2146': '052633' '2147': '052634' '2148': '052635' '2149': '052636' '2150': '052637' '2151': 052638 '2152': 052639 '2153': '052641' '2154': '052642' '2155': '052644' '2156': '052645' '2157': '052646' '2158': '052647' '2159': 052648 '2160': 052649 '2161': '052650' '2162': 052859 '2163': 052860 '2164': 052861 '2165': 052862 '2166': 052945 '2167': 052946 '2168': 052947 '2169': 052948 '2170': 052950 '2171': 052951 '2172': 052953 '2173': 052954 '2174': 052955 '2175': '053152' '2176': '053154' '2177': '053156' '2178': '053157' '2179': 053158 '2180': 053159 '2181': '053160' '2182': 053228 '2183': 053229 '2184': 053299 '2185': '053300' '2186': '053301' '2187': '053302' '2188': 053379 '2189': 053381 '2190': '053457' '2191': 053496 '2192': '053576' '2193': 053578 '2194': 053586 '2195': 053587 '2196': 053588 '2197': 053589 '2198': 053591 '2199': 053592 '2200': '053675' '2201': 
'053723' '2202': '053724' '2203': '053725' '2204': '053726' '2205': '053727' '2206': 053728 '2207': 053729 '2208': 053807 '2209': 053862 '2210': 053863 '2211': 053937 '2212': 054019 '2213': '054031' '2214': '054032' '2215': '054033' '2216': '054034' '2217': '054037' '2218': 054039 '2219': '054061' '2220': '054062' '2221': '054063' '2222': '054064' '2223': 054149 '2224': '054150' '2225': '054151' '2226': '054152' '2227': '054153' '2228': '054154' '2229': '054155' '2230': '054156' '2231': 054158 '2232': 054159 '2233': '054160' '2234': '054163' '2235': '054234' '2236': '054235' '2237': '054236' '2238': '054237' '2239': 054297 '2240': '054335' '2241': '054365' '2242': '054376' '2243': '054433' '2244': '054436' '2245': '054437' '2246': 054438 '2247': '054442' '2248': '054443' '2249': '054463' '2250': '054464' '2251': '054465' '2252': '054466' '2253': '054467' '2254': 054468 '2255': 054469 '2256': '054470' '2257': '054475' '2258': '054476' '2259': 054479 '2260': 054480 '2261': 054481 '2262': 054482 '2263': 054496 '2264': '054554' '2265': 054568 '2266': '054570' '2267': '054576' '2268': 054578 '2269': 054580 '2270': '054621' '2271': '054623' '2272': '054624' '2273': '054625' '2274': '054626' '2275': '054662' '2276': '054664' '2277': '054665' '2278': '054666' '2279': '054667' '2280': '054703' '2281': 054719 '2282': '054735' '2283': '054753' '2284': 054874 '2285': 054942 '2286': '055076' '2287': 055097 '2288': '055100' '2289': '055101' '2290': '055102' '2291': '055113' '2292': 055119 '2293': '055120' '2294': '055121' '2295': '055122' '2296': '055123' '2297': '055124' '2298': 055149 '2299': 055183 '2300': 055186 '2301': '055231' '2302': '055232' '2303': '055233' '2304': '055234' '2305': '055235' '2306': '055236' '2307': '055237' '2308': 055238 '2309': '055240' '2310': '055241' '2311': '055242' '2312': 055285 '2313': 055286 '2314': 055287 '2315': 055288 '2316': 055289 '2317': 055290 '2318': 055291 '2319': 055292 '2320': 055293 '2321': 055294 '2322': 055295 '2323': '055402' 
'2324': '055430' '2325': '055436' '2326': '055437' '2327': 055480 '2328': 055481 '2329': 055549 '2330': '055572' '2331': 055709 '2332': '055710' '2333': '055711' '2334': '055712' '2335': '055713' '2336': '055714' '2337': '055715' '2338': '055716' '2339': '055717' '2340': 055718 '2341': 055719 '2342': 055782 '2343': 055783 '2344': 055786 '2345': 055807 '2346': 055808 '2347': 055809 '2348': 055810 '2349': 055811 '2350': 055826 '2351': 055827 '2352': 055828 '2353': 055830 '2354': 055831 '2355': 055832 '2356': 055833 '2357': 055900 '2358': '056010' '2359': '056015' '2360': '056020' '2361': 056028 '2362': 056029 '2363': '056030' '2364': '056031' '2365': '056033' '2366': '056034' '2367': '056036' '2368': '056247' '2369': 056248 '2370': 056249 '2371': '056273' '2372': '056274' '2373': '056275' '2374': '056460' '2375': '056465' '2376': '056466' '2377': '056467' '2378': 056468 '2379': 056469 '2380': '056470' '2381': '056471' '2382': '056472' '2383': '056474' '2384': 056493 '2385': 056495 '2386': 056496 '2387': 056497 '2388': 056498 '2389': 056499 '2390': '056516' '2391': '056517' '2392': 056518 '2393': 056519 '2394': '056520' '2395': '056521' '2396': '056523' '2397': '056552' '2398': 056559 '2399': 056639 '2400': '056640' '2401': '056641' '2402': '056645' '2403': '056646' '2404': 056648 '2405': 056649 '2406': '056650' '2407': '056651' '2408': 056686 '2409': 056687 '2410': 056688 '2411': 056689 '2412': 056690 '2413': 056691 '2414': 056692 '2415': 056693 '2416': 056694 '2417': 056695 '2418': 056696 '2419': 056795 '2420': 056796 '2421': 056797 '2422': 056798 '2423': 056799 '2424': 056800 '2425': 056801 '2426': 056802 '2427': 056803 '2428': 056804 '2429': 056805 '2430': 056874 '2431': 056888 '2432': 056895 '2433': 056929 '2434': 057078 '2435': '057164' '2436': '057175' '2437': '057176' '2438': '057177' '2439': 057178 '2440': 057179 '2441': 057180 '2442': '057271' '2443': '057272' '2444': '057273' '2445': '057274' '2446': '057344' '2447': '057360' '2448': '057371' '2449': 
'057417' '2450': 057418 '2451': '057435' '2452': '057437' '2453': 057439 '2454': '057440' '2455': '057442' '2456': '057500' '2457': '057540' '2458': 057569 '2459': '057626' '2460': '057627' '2461': 057628 '2462': 057629 '2463': '057630' '2464': 057639 '2465': '057640' '2466': 057648 '2467': 057658 '2468': '057661' '2469': '057662' '2470': '057663' '2471': '057665' '2472': 057691 '2473': 057697 '2474': 057819 '2475': 057820 '2476': 057821 '2477': 057822 '2478': 057823 '2479': 057891 '2480': 057892 '2481': 057936 '2482': 057937 '2483': 057938 '2484': 057939 '2485': 057943 '2486': 057968 '2487': 058052 '2488': 058053 '2489': 058054 '2490': 058060 '2491': 058061 '2492': 058063 '2493': 058068 '2494': 058070 '2495': 058115 '2496': 058116 '2497': 058117 '2498': 058135 '2499': 058140 '2500': 058161 '2501': 058162 '2502': 058164 '2503': 058166 '2504': 058169 '2505': 058170 '2506': 058173 '2507': 058174 '2508': 058207 '2509': 058212 '2510': 058213 '2511': 058215 '2512': 058221 '2513': 058225 '2514': 058333 '2515': 058334 '2516': 058341 '2517': 058474 '2518': 058539 '2519': 058540 '2520': 058541 '2521': 058542 '2522': 058543 '2523': 059078 '2524': 059373 '2525': 059374 '2526': 059443 '2527': 059445 '2528': 059446 '2529': 059448 '2530': 059449 '2531': 059451 '2532': 059454 '2533': 059561 '2534': 059562 '2535': 059581 '2536': 059653 '2537': 059654 '2538': 059656 '2539': 059657 '2540': 059658 '2541': 059659 '2542': 059660 '2543': 059663 '2544': 059664 '2545': 059666 '2546': 059667 '2547': 059669 '2548': 059671 '2549': 059673 '2550': 059675 '2551': 059676 '2552': 059677 '2553': 059678 '2554': 059679 '2555': 059680 '2556': 059681 '2557': 059682 '2558': 059683 '2559': 059684 '2560': 059685 '2561': 059686 '2562': 059687 '2563': 059688 '2564': 059695 '2565': 059702 '2566': 059706 '2567': 059707 '2568': 059708 '2569': 059709 '2570': 059710 '2571': 059711 '2572': 059718 '2573': 059719 '2574': 059720 '2575': 059721 '2576': 059723 '2577': 059724 '2578': 059725 '2579': 059726 '2580': 
059727 '2581': 059823 '2582': 059876 '2583': 059930 '2584': '060037' '2585': 060038 '2586': '060041' '2587': '060042' '2588': '060045' '2589': 060048 '2590': '060074' '2591': '060143' '2592': '060144' '2593': '060145' '2594': '060146' '2595': '060170' '2596': '060317' '2597': '060331' '2598': '060472' '2599': '060474' '2600': '060476' '2601': '060477' '2602': 060478 '2603': '060510' '2604': '060533' '2605': '060534' '2606': '060535' '2607': '060536' '2608': '060537' '2609': '060544' '2610': '060547' '2611': 060548 '2612': 060549 '2613': '060736' '2614': '060753' '2615': '060754' '2616': '060755' '2617': '060756' '2618': '060757' '2619': 060758 '2620': '060775' '2621': '060776' '2622': '060777' '2623': 060857 '2624': 060864 '2625': 060865 '2626': 060871 '2627': 060872 '2628': 060873 '2629': 060874 '2630': 060875 '2631': 060994 '2632': '061006' '2633': '061007' '2634': 061008 '2635': '061010' '2636': '061011' '2637': '061012' '2638': '061013' '2639': 061159 '2640': '061160' '2641': '061161' '2642': '061172' '2643': '061174' '2644': '061175' '2645': '061452' '2646': '061453' '2647': 061491 '2648': 061492 '2649': 061493 '2650': 061587 '2651': 061589 '2652': 061591 '2653': 061592 '2654': 061668 '2655': '061670' '2656': 061679 '2657': '061734' '2658': '061736' '2659': '061742' '2660': 061814 '2661': 061820 '2662': 061821 '2663': 061884 '2664': '062001' '2665': '062003' '2666': '062005' '2667': '062007' '2668': '062163' '2669': '062164' '2670': '062165' '2671': 062180 '2672': 062183 '2673': 062184 '2674': 062185 '2675': 062186 '2676': 062187 '2677': 062188 '2678': 062189 '2679': 062190 '2680': 062191 '2681': 062192 '2682': 062193 '2683': 062194 '2684': 062195 '2685': 062196 '2686': '062337' '2687': '062426' '2688': '062436' '2689': '062445' '2690': '062446' '2691': 062448 '2692': 062449 '2693': '062450' '2694': '062452' '2695': 062458 '2696': '062525' '2697': '062526' '2698': '062527' '2699': 062528 '2700': 062529 '2701': '062531' '2702': '062532' '2703': '062533' '2704': 
'062534' '2705': 062586 '2706': 062589 '2707': 062591 '2708': 062592 '2709': 062594 '2710': 062595 '2711': 062596 '2712': '062655' '2713': '062671' '2714': '062742' '2715': 062748 '2716': 062749 '2717': '062750' '2718': '062751' '2719': '062753' '2720': '063043' '2721': '063044' '2722': '063045' '2723': '063064' '2724': '063065' '2725': '063117' '2726': 063149 '2727': 063159 '2728': '063161' '2729': 063191 '2730': 063208 '2731': '063224' '2732': '063226' '2733': '063250' '2734': '063251' '2735': '063252' '2736': '063253' '2737': '063255' '2738': '063257' '2739': 063258 '2740': 063287 '2741': 063289 '2742': 063290 '2743': 063291 '2744': 063292 '2745': '063456' '2746': '063457' '2747': '063470' '2748': '063471' '2749': '063472' '2750': '063626' '2751': '063655' '2752': '063733' '2753': '063747' '2754': '063755' '2755': '063757' '2756': '063770' '2757': 063789 '2758': 063803 '2759': 063804 '2760': 063805 '2761': 063874 '2762': 063900 '2763': 063908 '2764': 063922 '2765': 063936 '2766': 063999 '2767': '064005' '2768': '064006' '2769': '064007' '2770': 064008 '2771': 064009 '2772': '064035' '2773': 064078 '2774': 064079 '2775': 064091 '2776': 064093 '2777': '064247' '2778': 064248 '2779': 064249 '2780': '064252' '2781': '064253' '2782': '064331' '2783': '064332' '2784': '064333' '2785': '064334' '2786': 064338 '2787': '064364' '2788': '064365' '2789': '064366' '2790': '064407' '2791': 064408 '2792': 064409 '2793': '064410' '2794': '064515' '2795': '064516' '2796': '064517' '2797': 064519 '2798': '064520' '2799': '064521' '2800': '064522' '2801': '064523' '2802': '064535' '2803': '064536' '2804': '064537' '2805': 064538 '2806': '064542' '2807': '064553' '2808': '064556' '2809': '064567' '2810': 064590 '2811': 064591 '2812': 064592 '2813': 064593 '2814': 064594 '2815': '064601' '2816': '064604' '2817': 064618 '2818': '064625' '2819': '064626' '2820': '064627' '2821': 064628 '2822': 064629 '2823': '064630' '2824': '064631' '2825': 064659 '2826': 064787 '2827': 064788 
'2828': 064789 '2829': 064796 '2830': 064809 '2831': 064834 '2832': 064840 '2833': 064841 '2834': 064854 '2835': 064855 '2836': 064856 '2837': 064857 '2838': 064858 '2839': 064859 '2840': 064860 '2841': 064861 '2842': 064862 '2843': 064863 '2844': 064864 '2845': 064865 '2846': 064866 '2847': 064893 '2848': 064895 '2849': 064896 '2850': 064918 '2851': 064919 '2852': 064988 '2853': 064989 '2854': 064990 '2855': 064991 '2856': 064992 '2857': 064993 '2858': 064994 '2859': 064995 '2860': '065037' '2861': 065038 '2862': 065039 '2863': '065040' '2864': '065063' '2865': '065064' '2866': '065073' '2867': '065076' '2868': '065077' '2869': 065090 '2870': '065234' '2871': '065265' '2872': 065488 '2873': 065619 '2874': 065683 '2875': 065685 '2876': '065745' '2877': '065752' '2878': '065755' '2879': '065756' '2880': '065777' '2881': 065779 '2882': 065780 '2883': 065893 '2884': 066058 '2885': '066073' '2886': '066074' '2887': '066075' '2888': '066076' '2889': 066180 '2890': 066187 '2891': 066390 '2892': 066394 '2893': '066405' '2894': 066469 '2895': 066482 '2896': 066483 '2897': '066525' '2898': '066534' '2899': '066535' '2900': '066536' '2901': '066537' '2902': 066538 '2903': 066539 '2904': '066636' '2905': '066637' '2906': 066638 '2907': '066641' '2908': '066643' '2909': '066644' '2910': '066646' '2911': 066648 '2912': 066649 '2913': '066650' '2914': 066689 '2915': 066690 '2916': '066717' '2917': '066757' '2918': 066782 '2919': 066783 '2920': '067007' '2921': '067010' '2922': '067011' '2923': '067016' '2924': '067017' '2925': '067121' '2926': '067163' '2927': '067232' '2928': '067233' '2929': '067235' '2930': '067237' '2931': 067308 '2932': '067330' '2933': '067331' '2934': '067332' '2935': '067333' '2936': '067334' '2937': '067336' '2938': '067357' '2939': 067358 '2940': 067359 '2941': '067360' '2942': '067361' '2943': '067362' '2944': '067363' '2945': '067364' '2946': '067365' '2947': '067366' '2948': '067367' '2949': 067368 '2950': '067412' '2951': '067457' '2952': '067470' 
'2953': '067500' '2954': '067553' '2955': '067556' '2956': '067557' '2957': 067558 '2958': 067597 '2959': 067598 '2960': '067600' '2961': '067637' '2962': 067638 '2963': 067639 '2964': '067640' '2965': '067660' '2966': '067661' '2967': '067673' '2968': '067707' '2969': '067760' '2970': '067763' '2971': '067764' '2972': '067765' '2973': '067766' '2974': 067784 '2975': 067793 '2976': 067829 '2977': 068353 '2978': 068354 '2979': 068355 '2980': 068356 '2981': 068404 '2982': 068407 '2983': 068410 '2984': 068444 '2985': 068531 '2986': 068536 '2987': 068537 '2988': 068538 '2989': 068539 '2990': 068540 '2991': 068541 '2992': 068543 '2993': 068549 '2994': 068551 '2995': 068573 '2996': 068579 '2997': 068582 '2998': 068587 '2999': 068592 '3000': 068600 '3001': 068601 '3002': 068680 '3003': 068682 '3004': 068683 '3005': 068820 '3006': 068821 '3007': 068837 '3008': 068838 '3009': 068839 '3010': 068840 '3011': 068841 '3012': 068842 '3013': 068843 '3014': 068844 '3015': 068851 '3016': 068852 '3017': 068853 '3018': 068854 '3019': 068860 '3020': 068861 '3021': 068862 '3022': 068869 '3023': 068872 '3024': 068875 '3025': 068891 '3026': 068892 '3027': 068893 '3028': 068894 '3029': 068895 '3030': 068896 '3031': 068897 '3032': 068898 '3033': 068899 '3034': 068909 '3035': 069001 '3036': 069002 '3037': 069170 '3038': 069181 '3039': 069182 '3040': 069188 '3041': 069193 '3042': 069194 '3043': 069195 '3044': 069196 '3045': 069197 '3046': 069198 '3047': 069199 '3048': 069200 '3049': 069201 '3050': 069202 '3051': 069203 '3052': 069204 '3053': 069205 '3054': 069206 '3055': 069207 '3056': 069208 '3057': 069209 '3058': 069210 '3059': 069211 '3060': 069221 '3061': 069222 '3062': 069223 '3063': 069303 '3064': 069554 '3065': 069555 '3066': 069561 '3067': 069563 '3068': 069564 '3069': 069567 '3070': 069682 '3071': 069723 '3072': 069726 '3073': 069727 '3074': 069732 '3075': 069744 '3076': 069745 '3077': 069746 '3078': 069747 '3079': 069761 '3080': 069762 '3081': 069763 '3082': 069764 '3083': 069765 
'3084': 069766 '3085': 069767 '3086': 069768 '3087': 069781 '3088': 069784 '3089': 069785 '3090': 069787 '3091': 069788 '3092': 069789 '3093': 069791 '3094': 069792 '3095': 069793 '3096': 069798 '3097': 069822 '3098': 069823 '3099': 069824 '3100': 069825 '3101': 069826 '3102': 069827 '3103': 069828 '3104': 069830 '3105': 069833 '3106': 069904 '3107': 069947 '3108': 069949 '3109': 069985 '3110': '070002' '3111': '070005' '3112': '070174' '3113': '070206' '3114': '070207' '3115': 070208 '3116': 070299 '3117': '070300' '3118': '070301' '3119': '070302' '3120': '070303' '3121': '070402' '3122': '070403' '3123': 070409 '3124': '070423' '3125': '070424' '3126': '070425' '3127': '070426' '3128': '070654' '3129': '070655' '3130': '070657' '3131': '070660' '3132': 070768 '3133': '070770' '3134': '070772' '3135': '070773' '3136': '070774' '3137': '070775' '3138': 070813 '3139': 070873 '3140': 070875 '3141': 070878 '3142': 070879 '3143': 071096 '3144': '071133' '3145': '071157' '3146': 071158 '3147': '071172' '3148': '071173' '3149': '071174' '3150': '071175' '3151': '071216' '3152': '071225' '3153': 071228 '3154': '071230' '3155': '071231' '3156': '071240' '3157': '071241' '3158': '071242' '3159': '071243' '3160': '071244' '3161': '071245' '3162': '071246' '3163': '071247' '3164': 071248 '3165': 071249 '3166': '071250' '3167': '071251' '3168': '071252' '3169': '071253' '3170': '071254' '3171': '071255' '3172': '071276' '3173': '071303' '3174': '071304' '3175': '071371' '3176': '071372' '3177': '071420' '3178': '071503' '3179': '071506' '3180': '071507' '3181': 071508 '3182': 071509 '3183': '071510' '3184': '071511' '3185': '071512' '3186': '071513' '3187': '071514' '3188': '071515' '3189': '071516' '3190': '071617' '3191': '071620' '3192': '071622' '3193': 071690 '3194': 071691 '3195': 071692 '3196': 071693 '3197': 071694 '3198': 071695 '3199': 071709 '3200': '071711' '3201': '071714' '3202': '071715' '3203': 071719 '3204': '071721' '3205': '071722' '3206': 071822 '3207': 
071884 '3208': 071885 '3209': 071937 '3210': 071938 '3211': '072046' '3212': '072047' '3213': '072050' '3214': '072056' '3215': 072058 '3216': 072059 '3217': '072064' '3218': '072067' '3219': 072068 '3220': 072069 '3221': '072070' '3222': '072071' '3223': '072072' '3224': '072073' '3225': '072074' '3226': '072075' '3227': '072076' '3228': 072129 '3229': '072130' '3230': '072131' '3231': '072134' '3232': '072135' '3233': '072136' '3234': '072146' '3235': 072149 '3236': '072200' '3237': '072206' '3238': '072210' '3239': '072215' '3240': '072232' '3241': '072233' '3242': '072234' '3243': 072287 '3244': 072288 '3245': 072289 '3246': 072290 '3247': '072456' '3248': 072468 '3249': '072476' '3250': '072477' '3251': '072513' '3252': '072514' '3253': '072562' '3254': '072565' '3255': '072570' '3256': '072604' '3257': '072605' '3258': '072607' '3259': '072612' '3260': 072738 '3261': 072781 '3262': 072782 '3263': 072783 '3264': 072784 '3265': 072785 '3266': 072786 '3267': 072787 '3268': 072788 '3269': 072789 '3270': 072790 '3271': 072926 '3272': 072927 '3273': 072928 '3274': 072930 '3275': 073087 '3276': 073099 '3277': '073100' '3278': '073123' '3279': '073124' '3280': '073125' '3281': 073169 '3282': '073170' '3283': '073171' '3284': '073172' '3285': '073174' '3286': '073175' '3287': 073192 '3288': 073193 '3289': '073306' '3290': 073309 '3291': 073318 '3292': '073335' '3293': '073340' '3294': '073341' '3295': '073342' '3296': '073343' '3297': '073344' '3298': '073363' '3299': '073365' '3300': '073366' '3301': '073367' '3302': 073368 '3303': 073369 '3304': '073370' '3305': '073371' '3306': '073372' '3307': '073465' '3308': '073466' '3309': '073467' '3310': 073468 '3311': 073469 '3312': 073486 '3313': 073494 '3314': 073495 '3315': 073519 '3316': '073520' '3317': '073521' '3318': '073522' '3319': '073550' '3320': '073551' '3321': '073560' '3322': '073561' '3323': '073564' '3324': '073565' '3325': '073566' '3326': 073568 '3327': '073572' '3328': '073573' '3329': 073580 '3330': 
073584 '3331': 073585 '3332': 073587 '3333': 073658 '3334': '073675' '3335': '073760' '3336': '073761' '3337': '073762' '3338': '073763' '3339': '073764' '3340': '073765' '3341': '073766' '3342': '073767' '3343': 073768 '3344': 073769 '3345': '073770' '3346': '073771' '3347': '073772' '3348': '073773' '3349': '073774' '3350': '073775' '3351': '073776' '3352': '073777' '3353': 073778 '3354': 073779 '3355': 073792 '3356': 073797 '3357': 073819 '3358': 073820 '3359': 073821 '3360': 073822 '3361': 073921 '3362': '074002' '3363': '074302' '3364': '074347' '3365': 074348 '3366': '074362' '3367': '074365' '3368': '074370' '3369': '074371' '3370': '074372' '3371': '074373' '3372': '074374' '3373': '074375' '3374': '074376' '3375': '074377' '3376': 074378 '3377': 074380 '3378': 074381 '3379': 074382 '3380': 074383 '3381': 074384 '3382': 074385 '3383': 074386 '3384': 074387 '3385': 074388 '3386': 074389 '3387': 074390 '3388': 074391 '3389': 074392 '3390': 074393 '3391': '074421' '3392': '074445' '3393': '074546' '3394': 074669 '3395': '074671' '3396': '074706' '3397': 074908 '3398': 074937 '3399': 074942 '3400': 074945 '3401': 074954 '3402': 074955 '3403': 074959 '3404': 074960 '3405': 075194 '3406': '075211' '3407': '075221' '3408': '075230' '3409': '075304' '3410': '075310' '3411': '075314' '3412': '075317' '3413': '075371' '3414': '075372' '3415': '075373' '3416': '075374' '3417': '075375' '3418': '075376' '3419': '075377' '3420': 075378 '3421': 075379 '3422': 075380 '3423': 075381 '3424': 075383 '3425': 075386 '3426': 075389 '3427': 075390 '3428': 075391 '3429': 075393 '3430': 075395 '3431': 075396 '3432': 075398 '3433': 075399 '3434': '075401' '3435': '075403' '3436': '075412' '3437': '075415' '3438': '075417' '3439': 075418 '3440': 075419 '3441': '075420' '3442': '075425' '3443': '075427' '3444': 075428 '3445': 075429 '3446': '075430' '3447': '075431' '3448': '075432' '3449': '075433' '3450': '075434' '3451': '075435' '3452': '075436' '3453': '075437' '3454': 075438 
'3455': 075439 '3456': '075440' '3457': '075441' '3458': '075442' '3459': '075443' '3460': '075607' '3461': '075612' '3462': 075692 '3463': '075745' '3464': '075746' '3465': '075747' '3466': 075748 '3467': 075749 '3468': '075750' '3469': '075751' '3470': '075752' '3471': '075754' '3472': '075755' '3473': '075762' '3474': '075763' '3475': '075764' '3476': 075782 '3477': 075783 '3478': 075784 '3479': 075785 '3480': 075786 '3481': 075787 '3482': 075788 '3483': 075844 '3484': 075862 '3485': 075866 '3486': 075869 '3487': 075883 '3488': 075903 '3489': 075908 '3490': 075925 '3491': 075926 '3492': 075927 '3493': 075928 '3494': 075929 '3495': 075930 '3496': 075931 '3497': 075932 '3498': 075933 '3499': 075935 '3500': 075936 '3501': 075937 '3502': 075975 '3503': '076036' '3504': 076069 '3505': '076071' '3506': '076072' '3507': '076073' '3508': '076074' '3509': '076075' '3510': '076076' '3511': '076077' '3512': 076078 '3513': 076079 '3514': '076121' '3515': 076128 '3516': 076129 '3517': '076130' '3518': '076131' '3519': '076363' '3520': '076375' '3521': 076381 '3522': '076437' '3523': '076440' '3524': '076654' '3525': 076659 '3526': '077517' '3527': 077519 '3528': '077521' '3529': '077522' '3530': '077523' '3531': '077564' '3532': '077571' '3533': '077572' '3534': 077952 '3535': 078038 '3536': 078156 '3537': 078213 '3538': 078516 '3539': 078833 '3540': 078834 '3541': 078839 '3542': 078841 '3543': 078843 '3544': 078845 '3545': 078847 '3546': 078848 '3547': 078849 '3548': 078850 '3549': 078851 '3550': 078852 '3551': 078984 '3552': 078998 '3553': 079087 '3554': 079575 '3555': 079593 '3556': 079605 '3557': 079606 '3558': 079610 '3559': 079616 '3560': 079741 '3561': 079973 '3562': 079975 '3563': 079977 '3564': 079978 '3565': 079985 '3566': 079986 '3567': 079988 '3568': 079990 '3569': 079995 '3570': 080000 '3571': 080001 '3572': 080002 '3573': 080003 '3574': 080004 '3575': 080005 '3576': 080035 '3577': 080293 '3578': 080341 '3579': 080351 '3580': 080389 '3581': 080402 '3582': 080515 
'3583': 080516 '3584': 080517 '3585': 080518 '3586': 080519 '3587': 080520 '3588': 080611 '3589': 080680 '3590': 080686 '3591': 080687 '3592': 080693 '3593': 080694 '3594': 080695 '3595': 080696 '3596': 080697 '3597': 080751 '3598': 080753 '3599': 080754 '3600': 080755 '3601': 080756 '3602': 080758 '3603': 080765 '3604': 080766 '3605': 080772 '3606': 080773 '3607': 080774 '3608': 080775 '3609': 080776 '3610': 080793 '3611': 080833 '3612': 080834 '3613': 080835 '3614': 080836 '3615': 081033 '3616': 081037 '3617': 081071 '3618': 081082 '3619': 081083 '3620': 081084 '3621': 081085 '3622': 081189 '3623': 081193 '3624': 081194 '3625': 081195 '3626': 081362 '3627': 081365 '3628': 081436 '3629': 081457 '3630': 081485 '3631': 081491 '3632': 081512 '3633': 081523 '3634': 081543 '3635': 081554 '3636': 081555 '3637': 081565 '3638': 081576 '3639': 081586 '3640': 081600 '3641': 081612 '3642': 081613 '3643': 081623 '3644': 081638 '3645': 081650 '3646': 081660 '3647': 081781 '3648': 081782 '3649': 081792 '3650': 081802 '3651': 081803 '3652': 081814 '3653': 081868 '3654': 081895 '3655': 081938 '3656': 081945 '3657': 081946 '3658': 081988 '3659': 081999 '3660': 082157 '3661': 082231 '3662': 082237 '3663': 082242 '3664': 082250 '3665': 082410 '3666': 082462 '3667': 082464 '3668': 082505 '3669': 082507 '3670': 082628 '3671': 082629 '3672': 082630 '3673': 082631 '3674': 082778 '3675': 082780 '3676': 082881 '3677': 082886 '3678': 082890 '3679': 082892 '3680': 082893 '3681': 082914 '3682': 082915 '3683': 082916 '3684': 082917 '3685': 082918 '3686': 082919 '3687': 082920 '3688': 082921 '3689': 082928 '3690': 082929 '3691': 082930 '3692': 082931 '3693': 082932 '3694': 083437 '3695': 083438 '3696': 083439 '3697': 083440 '3698': 083507 '3699': 083509 '3700': 083511 '3701': 083512 '3702': 083558 '3703': 083600 '3704': 083612 '3705': 083613 '3706': 083715 '3707': 083717 '3708': 083718 '3709': 083719 '3710': 083789 '3711': 083790 '3712': 083791 '3713': 083898 '3714': 083903 '3715': 083906 
'3716': 083908 '3717': 083911 '3718': 083913 '3719': 083954 '3720': 083960 '3721': 083969 '3722': 084009 '3723': 084054 '3724': 084055 '3725': 084056 '3726': 084057 '3727': 084058 '3728': 084091 '3729': 084095 '3730': 084096 '3731': 084097 '3732': 084111 '3733': 084135 '3734': 084136 '3735': 084139 '3736': 084141 '3737': 084142 '3738': 084144 '3739': 084152 '3740': 084154 '3741': 084155 '3742': 084156 '3743': 084157 '3744': 084158 '3745': 084159 '3746': 084195 '3747': 084198 '3748': 084200 '3749': 084201 '3750': 084202 '3751': 084264 '3752': 084290 '3753': 084291 '3754': 084405 '3755': 084417 '3756': 084423 '3757': 084483 '3758': 084484 '3759': 084485 '3760': 084486 '3761': 084605 '3762': 084736 '3763': 084743 '3764': 084757 '3765': 084768 '3766': 084777 '3767': 084788 '3768': 084817 '3769': 085027 '3770': 085038 '3771': 085039 '3772': 085040 '3773': 085041 '3774': 085290 '3775': 085291 '3776': 085307 '3777': 085308 '3778': 085309 '3779': 085310 '3780': 085311 '3781': 085317 '3782': 085318 '3783': 085343 '3784': 085346 '3785': 085347 '3786': 085400 '3787': 085419 '3788': 085420 '3789': 085421 '3790': 085422 '3791': 085423 '3792': 085424 '3793': 085425 '3794': 085426 '3795': 085427 '3796': 085428 '3797': 085436 '3798': 085438 '3799': 085482 '3800': 085484 '3801': 085485 '3802': 085486 '3803': 085487 '3804': 085488 '3805': 085489 '3806': 085490 '3807': 085491 '3808': 085492 '3809': 085494 '3810': 085592 '3811': 085593 '3812': 085594 '3813': 085595 '3814': 085596 '3815': 085598 '3816': 085599 '3817': 085600 '3818': 085691 '3819': 085692 '3820': 085693 '3821': 085787 '3822': 085788 '3823': 085791 '3824': 085792 '3825': 085816 '3826': 085817 '3827': 085822 '3828': 085823 '3829': 085828 '3830': 085831 '3831': 085832 '3832': 085833 '3833': 085834 '3834': 085835 '3835': 085836 '3836': 085837 '3837': 085838 '3838': 085839 '3839': 085840 '3840': 085950 '3841': 085951 '3842': 085952 '3843': 085953 '3844': 085954 '3845': 085955 '3846': 085956 '3847': 085957 '3848': 085963 
'3849': 085966 '3850': 085967 '3851': 085968 '3852': 085973 '3853': 086037 '3854': 086038 '3855': 086039 '3856': 086040 '3857': 086077 '3858': 086081 '3859': 086082 '3860': 086116 '3861': 086117 '3862': 086118 '3863': 086119 '3864': 086140 '3865': 086256 '3866': 086259 '3867': 086262 '3868': 086263 '3869': 086415 '3870': 086416 '3871': 086417 '3872': 086419 '3873': 086441 '3874': 086443 '3875': 086481 '3876': 086482 '3877': 086483 '3878': 086484 '3879': 086485 '3880': 086486 '3881': 086487 '3882': 086562 '3883': 086576 '3884': 086623 '3885': 086634 '3886': 086678 '3887': 086679 '3888': 086680 '3889': 086720 '3890': 086721 '3891': 086724 '3892': 086725 '3893': 086730 '3894': 086761 '3895': 086762 '3896': 086763 '3897': 086788 '3898': 086793 '3899': 086795 '3900': 086799 '3901': 086993 '3902': 087068 '3903': 087069 '3904': 087070 '3905': 087096 '3906': 087097 '3907': 087098 '3908': 087099 '3909': 087100 '3910': 087101 '3911': 087102 '3912': 087103 '3913': 087104 '3914': 087105 '3915': 087106 '3916': 087107 '3917': 087108 '3918': 087121 '3919': 087151 '3920': 087152 '3921': 087153 '3922': 087154 '3923': 087155 '3924': 087157 '3925': 087158 '3926': 087159 '3927': 087160 '3928': 087161 '3929': 087185 '3930': 087186 '3931': 087187 '3932': 087188 '3933': 087189 '3934': 087190 '3935': 087191 '3936': 087192 '3937': 087193 '3938': 087194 '3939': 087237 '3940': 087322 '3941': 087323 '3942': 087324 '3943': 087325 '3944': 087361 '3945': 087362 '3946': 087363 '3947': 087377 '3948': 087430 '3949': 087431 '3950': 087490 '3951': 087639 '3952': 087641 '3953': 087642 '3954': 087643 '3955': 087644 '3956': 087645 '3957': 087965 '3958': 087966 '3959': 087967 '3960': 087968 '3961': 087971 '3962': 087972 '3963': 088428 '3964': 088429 '3965': 088485 '3966': 088486 '3967': 088846 '3968': 088848 '3969': 088854 '3970': 088856 '3971': 088858 '3972': 088860 '3973': 088861 '3974': 088863 '3975': 088864 '3976': 088867 '3977': 088868 '3978': 088869 '3979': 088870 '3980': 088871 '3981': 088872 
'3982': 088873 '3983': 088874 '3984': 088875 '3985': 088876 '3986': 088877 '3987': 088878 '3988': 088879 '3989': 088892 '3990': 088899 '3991': 088900 '3992': 088959 '3993': 088960 '3994': 089178 '3995': 089179 '3996': 089192 '3997': 089195 '3998': 089196 '3999': 089212 '4000': 089350 '4001': 089376 '4002': 089441 '4003': 089445 '4004': 089447 '4005': 089456 '4006': 089473 '4007': 089474 '4008': 089477 '4009': 089482 '4010': 089484 '4011': 089485 '4012': 089486 '4013': 089639 '4014': 089704 '4015': 089814 '4016': 089815 '4017': 089816 '4018': 089817 '4019': 089841 '4020': 089843 '4021': 089846 '4022': 089847 '4023': 089848 '4024': 089857 '4025': 089859 '4026': 089860 '4027': 089991 '4028': 089992 '4029': 090027 '4030': 090074 '4031': 090278 '4032': 090526 '4033': 090527 '4034': 090529 '4035': 090530 '4036': 090570 '4037': 090579 '4038': 090582 '4039': 090583 '4040': 090587 '4041': 090589 '4042': 090590 '4043': 090591 '4044': 090592 '4045': 090616 '4046': 090617 '4047': 090618 '4048': 090625 '4049': 090639 '4050': 090652 '4051': 090695 '4052': 090804 '4053': 090824 '4054': 090826 '4055': 090828 '4056': 090982 '4057': 090987 '4058': 090993 '4059': 091081 '4060': 091082 '4061': 091083 '4062': 091084 '4063': 091085 '4064': 091086 '4065': 091087 '4066': 091088 '4067': 091089 '4068': 091092 '4069': 091093 '4070': 091098 '4071': 091102 '4072': 091130 '4073': 091157 '4074': 091158 '4075': 091159 '4076': 091160 '4077': 091161 '4078': 091162 '4079': 091163 '4080': 091164 '4081': 091170 '4082': 091177 '4083': 091178 '4084': 091179 '4085': 091181 '4086': 091182 '4087': 091183 '4088': 091184 '4089': 091185 '4090': 091186 '4091': 091187 '4092': 091205 '4093': 091228 '4094': 091238 '4095': 091306 '4096': 091309 '4097': 091312 '4098': 091315 '4099': 091317 '4100': 091318 '4101': 091319 '4102': 091329 '4103': 091349 '4104': 091443 '4105': 091455 '4106': 091458 '4107': 091459 '4108': 091468 '4109': 091471 '4110': 091619 '4111': 091620 '4112': 091621 '4113': 091622 '4114': 091623 
'4115': 091624 '4116': 091625 '4117': 091755 '4118': 091788 '4119': 091790 '4120': 091791 '4121': 091793 '4122': 091796 '4123': 091797 '4124': 091851 '4125': 091868 '4126': 091869 '4127': 091894 '4128': 091897 '4129': 091899 '4130': 091900 '4131': 091933 '4132': 091934 '4133': 091936 '4134': 091937 '4135': 091938 '4136': 091958 '4137': 091960 '4138': 092124 '4139': 092125 '4140': 092129 '4141': 092130 '4142': 092131 '4143': 092206 '4144': 092275 '4145': 092282 '4146': 092283 '4147': 092284 '4148': 092292 '4149': 092366 '4150': 092466 '4151': 092508 '4152': 092535 '4153': 092536 '4154': 092538 '4155': 092539 '4156': 092540 '4157': 092546 '4158': 092548 '4159': 092549 '4160': 092551 '4161': 092554 '4162': 092556 '4163': 092561 '4164': 092562 '4165': 092564 '4166': 092565 '4167': 092573 '4168': 092574 '4169': 092868 '4170': 092872 '4171': 092873 '4172': 092874 '4173': 092878 '4174': 092881 '4175': 092885 '4176': 092886 '4177': 092887 '4178': 092888 '4179': 092889 '4180': 092947 '4181': 092948 '4182': 092949 '4183': 092950 '4184': 092951 '4185': 092952 '4186': 092953 '4187': 092954 '4188': 092955 '4189': 093074 '4190': 093075 '4191': 093076 '4192': 093363 '4193': 093364 '4194': 093518 '4195': 093519 '4196': 093520 '4197': 093521 '4198': 093522 '4199': 093523 '4200': 093704 '4201': 093710 '4202': 093712 '4203': 093716 '4204': 093727 '4205': 093867 '4206': 093868 '4207': 093915 '4208': 093917 '4209': 093918 '4210': 093919 '4211': 093920 '4212': 093921 '4213': 093940 '4214': 093941 '4215': 093942 '4216': 093943 '4217': 093944 '4218': 093950 '4219': 093956 '4220': 093981 '4221': 093983 '4222': 093985 '4223': 093986 '4224': 094026 '4225': 094033 '4226': 094034 '4227': 094035 '4228': 094036 '4229': 094037 '4230': 094038 '4231': 094039 '4232': 094093 '4233': 094099 '4234': 094101 '4235': 094102 '4236': 094263 '4237': 094348 '4238': 094411 '4239': 094414 '4240': 094415 '4241': 094419 '4242': 094422 '4243': 094423 '4244': 094426 '4245': 094449 '4246': 094465 '4247': 094467 
'4248': 094468 '4249': 094628 '4250': 094630 '4251': 094631 '4252': 094632 '4253': 094634 '4254': 094635 '4255': 094638 '4256': 094803 '4257': 095189 '4258': 095231 '4259': 095248 '4260': 095249 '4261': 095250 '4262': 095251 '4263': 095308 '4264': 095309 '4265': 095310 '4266': 095452 '4267': 095486 '4268': 095506 '4269': 095535 '4270': 095564 '4271': 095722 '4272': 095724 '4273': 095725 '4274': 095726 '4275': 095727 '4276': 095908 '4277': 095910 '4278': 095911 '4279': 095912 '4280': 095914 '4281': 095915 '4282': 096166 '4283': 096167 '4284': 096168 '4285': 096169 '4286': 096399 '4287': 096400 '4288': 096401 '4289': 096402 '4290': 096403 '4291': 096408 '4292': 096560 '4293': 096627 '4294': 096657 '4295': 096675 '4296': 096678 '4297': 096692 '4298': 096693 '4299': 096694 '4300': 096695 '4301': 096696 '4302': 096697 '4303': 096698 '4304': 096699 '4305': 096718 '4306': 096726 '4307': 096728 '4308': 096729 '4309': 096730 '4310': 096731 '4311': 096738 '4312': 096742 '4313': 096743 '4314': 096759 '4315': 096898 '4316': 096900 '4317': 096901 '4318': 096902 '4319': 096935 '4320': 096936 '4321': 096944 '4322': 096945 '4323': 096946 '4324': 097037 '4325': 097041 '4326': 097043 '4327': 097211 '4328': 097215 '4329': 097216 '4330': 097279 '4331': 097283 '4332': 097285 '4333': 097286 '4334': 097373 '4335': 097374 '4336': 097393 '4337': 097404 '4338': 097406 '4339': 097407 '4340': 097424 '4341': 097540 '4342': 097542 '4343': 097544 '4344': 097545 '4345': 097547 '4346': 097548 '4347': 097568 '4348': 097569 '4349': 097570 '4350': 097585 '4351': 097586 '4352': 097587 '4353': 097588 '4354': 097589 '4355': 097590 '4356': 097690 '4357': 097691 '4358': 097692 '4359': 097697 '4360': 097793 '4361': 097794 '4362': 097813 '4363': 097814 '4364': 097841 '4365': 097844 '4366': 097845 '4367': 097846 '4368': 097847 '4369': 097848 '4370': 097886 '4371': 097887 '4372': 097894 '4373': 097940 '4374': 097958 '4375': 097959 '4376': 097960 '4377': 097961 '4378': 097962 '4379': 097980 '4380': 097986 
'4381': 097987 '4382': 097988 '4383': 097989 '4384': 098025 '4385': 098026 '4386': 098028 '4387': 098031 '4388': 098077 '4389': 098202 '4390': 098203 '4391': 098204 '4392': 098205 '4393': 098206 '4394': 098227 '4395': 098228 '4396': 098229 '4397': 098235 '4398': 098236 '4399': 098237 '4400': 098238 '4401': 098251 '4402': 098297 '4403': 098298 '4404': 098299 '4405': 098300 '4406': 098301 '4407': 098302 '4408': 098339 '4409': 098346 '4410': 098348 '4411': 098349 '4412': 098547 '4413': 098548 '4414': 098549 '4415': 098550 '4416': 098551 '4417': 098552 '4418': 098553 '4419': 098554 '4420': 098555 '4421': 098556 '4422': 098557 '4423': 098565 '4424': 098567 '4425': 098569 '4426': 098573 '4427': 098574 '4428': 098575 '4429': 098576 '4430': 098577 '4431': 098578 '4432': 098579 '4433': 098580 '4434': 098581 '4435': 098582 '4436': 098583 '4437': 098584 '4438': 098585 '4439': 098613 '4440': 098617 '4441': 098618 '4442': 098619 '4443': 098620 '4444': 098621 '4445': 098622 '4446': 098623 '4447': 098624 '4448': 098625 '4449': 098626 '4450': 098627 '4451': 098628 '4452': 098655 '4453': 098656 '4454': 098657 '4455': 098666 '4456': 098667 '4457': 098668 '4458': 098669 '4459': 098670 '4460': 098671 '4461': 098680 '4462': 098681 '4463': 098701 '4464': 098770 '4465': 098838 '4466': 099041 '4467': 099093 '4468': 099095 '4469': 099096 '4470': 099135 '4471': 099214 '4472': 099260 '4473': 099261 '4474': 099274 '4475': 099311 '4476': 099313 '4477': 099345 '4478': 099361 '4479': 099362 '4480': 099363 '4481': 099364 '4482': 099368 '4483': 099369 '4484': 099370 '4485': 099371 '4486': 099372 '4487': 099373 '4488': 099374 '4489': 099375 '4490': 099389 '4491': 099390 '4492': 099391 '4493': 099392 '4494': 099393 '4495': 099394 '4496': 099395 '4497': 099411 '4498': 099419 '4499': 099436 '4500': 099437 '4501': 099438 '4502': 099439 '4503': 099440 '4504': 099441 '4505': 099442 '4506': 099501 '4507': 099703 '4508': 099704 '4509': 099707 '4510': '100478' '4511': '100479' '4512': '100480' '4513': 
'100497' '4514': '100522' '4515': '100535' '4516': '100536' '4517': '100544' '4518': '100549' '4519': '100550' '4520': '100552' '4521': '100745' '4522': '100799' '4523': '100802' '4524': '100835' '4525': '100949' '4526': '100958' '4527': '100959' '4528': '100972' '4529': '100973' '4530': '100975' '4531': '100976' '4532': '101111' '4533': '101112' '4534': '101116' '4535': '101118' '4536': '101119' '4537': '101864' '4538': '101868' '4539': '101873' '4540': '101893' '4541': '101951' '4542': '102092' '4543': '102112' '4544': '102114' '4545': '102195' '4546': '103518' '4547': '103519' '4548': '103520' '4549': '103521' '4550': '103522' '4551': '103523' '4552': '103600' '4553': '103800' '4554': '103808' '4555': '104008' '4556': '104009' '4557': '104010' '4558': '104062' '4559': '104063' '4560': '104064' '4561': '104065' '4562': '104066' '4563': '104067' '4564': '104068' '4565': '104086' '4566': '104227' '4567': '104276' '4568': '104277' '4569': '104278' '4570': '104279' '4571': '104282' '4572': '104283' '4573': '104284' '4574': '104356' '4575': '104357' '4576': '104434' '4577': '104625' '4578': '104668' '4579': '104724' '4580': '104725' '4581': '104779' '4582': '104780' '4583': '105022' '4584': '105119' '4585': '105141' '4586': '105142' '4587': '105144' '4588': '105145' '4589': '105196' '4590': '105408' '4591': '105411' '4592': '105412' '4593': '105413' '4594': '105414' '4595': '105443' '4596': '105450' '4597': '105451' '4598': '105662' '4599': '105664' '4600': '105670' '4601': '105671' '4602': '105672' '4603': '105673' '4604': '105674' '4605': '105682' '4606': '105683' '4607': '105685' '4608': '105712' '4609': '105713' '4610': '105714' '4611': '105715' '4612': '105716' '4613': '105717' '4614': '105718' '4615': '105719' '4616': '105720' '4617': '105722' '4618': '105824' '4619': '105825' '4620': '105826' '4621': '105827' '4622': '105887' '4623': '105890' '4624': '105912' '4625': '105914' '4626': '105915' '4627': '105916' '4628': '105917' '4629': '105918' '4630': '105919' 
'4631': '105920' '4632': '106274' '4633': '106277' '4634': '106339' '4635': '106342' '4636': '106343' '4637': '106456' '4638': '106457' '4639': '106458' '4640': '106463' '4641': '106465' '4642': '106502' '4643': '106522' '4644': '106562' '4645': '106563' '4646': '106564' '4647': '106566' '4648': '106567' '4649': '106568' '4650': '106569' '4651': '106570' '4652': '106571' '4653': '106629' '4654': '106872' '4655': '106876' '4656': '106877' '4657': '106937' '4658': '106948' '4659': '106951' '4660': '106952' '4661': '106953' '4662': '106954' '4663': '106955' '4664': '106956' '4665': '107020' '4666': '107021' '4667': '107025' '4668': '107027' '4669': '107028' '4670': '107029' '4671': '107030' '4672': '107031' '4673': '107046' '4674': '107047' '4675': '107048' '4676': '107049' '4677': '107050' '4678': '107101' '4679': '107125' '4680': '107126' '4681': '107127' '4682': '107128' '4683': '107129' '4684': '107178' '4685': '107179' '4686': '107180' '4687': '107181' '4688': '107182' '4689': '107183' '4690': '107184' '4691': '107185' '4692': '107186' '4693': '107187' '4694': '107188' '4695': '107189' '4696': '107248' '4697': '107249' '4698': '107250' '4699': '107251' '4700': '107256' '4701': '107257' '4702': '107388' '4703': '107389' '4704': '107390' '4705': '107391' '4706': '107425' '4707': '107426' '4708': '107427' '4709': '107429' '4710': '107432' '4711': '107433' '4712': '107434' '4713': '107435' '4714': '107476' '4715': '107506' '4716': '107531' '4717': '107532' '4718': '107533' '4719': '107534' '4720': '107535' '4721': '107567' '4722': '107569' '4723': '107571' '4724': '107574' '4725': '107577' '4726': '107578' '4727': '107579' '4728': '107583' '4729': '107584' '4730': '107588' '4731': '107589' '4732': '107590' '4733': '107591' '4734': '107592' '4735': '107593' '4736': '107594' '4737': '107595' '4738': '107596' '4739': '107597' '4740': '107598' '4741': '107613' '4742': '107616' '4743': '107617' '4744': '107659' '4745': '107799' '4746': '107804' '4747': '107805' '4748': 
'107809' '4749': '107810' '4750': '107850' '4751': '107851' '4752': '107852' '4753': '107908' '4754': '107909' '4755': '107910' '4756': '107911' '4757': '107912' '4758': '107913' '4759': '107949' '4760': '107950' '4761': '107951' '4762': '107952' '4763': '107953' '4764': '107954' '4765': '107955' '4766': '107956' '4767': '107957' '4768': '108012' '4769': '108014' '4770': '108015' '4771': '108016' '4772': '108017' '4773': '108018' '4774': '108019' '4775': '108020' '4776': '108021' '4777': '108022' '4778': '108023' '4779': '108024' '4780': '108025' '4781': '108026' '4782': '108027' '4783': '108031' '4784': '108036' '4785': '108037' '4786': '108038' '4787': '108049' '4788': '108050' '4789': '108059' '4790': '108060' '4791': '108079' '4792': '108155' '4793': '108230' '4794': '108290' '4795': '108297' '4796': '108298' '4797': '108299' '4798': '108300' '4799': '108301' '4800': '108302' '4801': '108303' '4802': '108304' '4803': '108305' '4804': '108306' '4805': '108307' '4806': '108308' '4807': '108313' '4808': '108314' '4809': '108318' '4810': '108319' '4811': '108339' '4812': '108341' '4813': '108342' '4814': '108343' '4815': '108415' '4816': '108416' '4817': '108418' '4818': '108420' '4819': '108421' '4820': '108422' '4821': '108423' '4822': '108425' '4823': '108426' '4824': '108427' '4825': '108428' '4826': '108429' '4827': '108456' '4828': '108457' '4829': '108459' '4830': '108460' '4831': '108461' '4832': '108464' '4833': '108471' '4834': '108472' '4835': '108473' '4836': '108474' '4837': '108475' '4838': '108476' '4839': '108477' '4840': '108478' '4841': '108487' '4842': '108488' '4843': '108489' '4844': '108490' '4845': '108491' '4846': '108492' '4847': '108493' '4848': '108494' '4849': '108495' '4850': '108496' '4851': '108497' '4852': '108498' '4853': '108499' '4854': '108500' '4855': '108501' '4856': '108502' '4857': '108503' '4858': '108504' '4859': '108505' '4860': '108524' '4861': '108525' '4862': '108526' '4863': '108527' '4864': '108528' '4865': '108529' 
'4866': '108530' '4867': '108531' '4868': '108532' '4869': '108533' '4870': '108745' '4871': '108774' '4872': '108799' '4873': '108808' '4874': '108809' '4875': '108812' '4876': '108836' '4877': '108837' '4878': '108838' '4879': '108839' '4880': '108840' '4881': '108841' '4882': '108842' '4883': '108843' '4884': '108845' '4885': '108846' '4886': '108847' '4887': '108863' '4888': '108864' '4889': '108865' '4890': '108866' '4891': '108867' '4892': '108868' '4893': '108878' '4894': '108879' '4895': '108880' '4896': '108881' '4897': '108882' '4898': '108883' '4899': '108884' '4900': '108885' '4901': '108906' '4902': '108957' '4903': '108961' '4904': '108962' '4905': '108967' '4906': '108968' '4907': '108969' '4908': '108970' '4909': '108992' '4910': '109068' '4911': '109071' '4912': '109072' '4913': '109106' '4914': '109144' '4915': '109189' '4916': '109191' '4917': '109203' '4918': '109235' '4919': '109276' '4920': '109349' '4921': '109350' '4922': '109355' '4923': '109356' '4924': '109357' '4925': '109445' '4926': '109446' '4927': '109447' '4928': '109448' '4929': '109449' '4930': '109450' '4931': '109468' '4932': '109480' '4933': '109481' '4934': '109497' '4935': '109535' '4936': '109537' '4937': '109538' '4938': '109542' '4939': '109543' '4940': '109548' '4941': '109670' '4942': '109681' '4943': '109684' '4944': '109685' '4945': '109686' '4946': '109687' '4947': '109711' '4948': '109712' '4949': '109896' '4950': '109900' '4951': '109901' '4952': '109902' '4953': '109903' '4954': '109904' '4955': '109905' '4956': '109906' '4957': '109925' '4958': '109957' '4959': '109958' '4960': '109960' '4961': '109962' '4962': '109963' '4963': '109971' '4964': '109972' '4965': '109973' '4966': '109974' '4967': '109975' '4968': '109976' '4969': '109977' '4970': '109978' '4971': '110070' '4972': '110082' '4973': '110084' '4974': '110085' '4975': '110086' '4976': '110102' '4977': '110103' '4978': '110104' '4979': '110105' '4980': '110106' '4981': '110107' '4982': '110108' '4983': 
'110109' '4984': '110110' '4985': '110111' '4986': '110166' '4987': '110167' '4988': '110171' '4989': '110172' '4990': '110204' '4991': '110205' '4992': '110206' '4993': '110207' '4994': '110208' '4995': '110209' '4996': '110230' '4997': '110259' '4998': '110260' '4999': '110261' '5000': '110262' '5001': '110263' '5002': '110264' '5003': '110265' '5004': '110266' '5005': '110267' '5006': '110274' '5007': '110384' '5008': '110410' '5009': '110417' '5010': '110436' '5011': '110437' '5012': '110438' '5013': '110439' '5014': '110440' '5015': '110441' '5016': '110447' '5017': '110448' '5018': '110449' '5019': '110450' '5020': '110451' '5021': '110452' '5022': '110546' '5023': '110610' '5024': '110611' '5025': '110623' '5026': '110629' '5027': '110630' '5028': '110634' '5029': '110636' '5030': '110637' '5031': '110647' '5032': '110648' '5033': '110649' '5034': '110650' '5035': '110651' '5036': '110652' '5037': '110653' '5038': '110654' '5039': '110681' '5040': '110684' '5041': '110687' '5042': '110688' '5043': '110689' '5044': '110690' '5045': '110691' '5046': '110711' '5047': '110735' '5048': '110736' '5049': '110743' '5050': '110744' '5051': '110756' '5052': '110764' '5053': '110765' '5054': '110768' '5055': '110771' '5056': '110772' '5057': '110774' '5058': '110775' '5059': '110776' '5060': '110777' '5061': '110778' '5062': '110779' '5063': '110923' '5064': '110927' '5065': '110928' '5066': '110980' '5067': '110982' '5068': '110983' '5069': '110985' '5070': '111015' '5071': '111146' '5072': '111147' '5073': '111148' '5074': '111149' '5075': '111150' '5076': '111151' '5077': '111153' '5078': '111154' '5079': '111182' '5080': '111186' '5081': '111187' '5082': '111188' '5083': '111216' '5084': '111220' '5085': '111221' '5086': '111222' '5087': '111223' '5088': '111224' '5089': '111225' '5090': '111226' '5091': '111227' '5092': '111228' '5093': '111229' '5094': '111230' '5095': '111306' '5096': '111311' '5097': '111335' '5098': '111367' '5099': '111368' '5100': '111371' 
'5101': '111372' '5102': '111375' '5103': '111376' '5104': '111377' '5105': '111378' '5106': '111379' '5107': '111382' '5108': '111385' '5109': '111386' '5110': '111387' '5111': '111388' '5112': '111389' '5113': '111390' '5114': '111391' '5115': '111392' '5116': '111393' '5117': '111394' '5118': '111395' '5119': '111396' '5120': '111397' '5121': '111398' '5122': '111399' '5123': '111400' '5124': '111401' '5125': '111402' '5126': '111413' '5127': '111416' '5128': '111460' '5129': '111579' '5130': '111658' '5131': '111747' '5132': '111793' '5133': '111819' '5134': '111871' '5135': '111872' '5136': '111873' '5137': '111911' '5138': '111933' '5139': '111934' '5140': '111935' '5141': '111936' '5142': '111937' '5143': '111938' '5144': '111974' '5145': '111982' '5146': '111994' '5147': '112000' '5148': '112001' '5149': '112020' '5150': '112065' '5151': '112066' '5152': '112088' '5153': '112133' '5154': '112196' '5155': '112197' '5156': '112198' '5157': '112199' '5158': '112209' '5159': '112210' '5160': '112211' '5161': '112215' '5162': '112252' '5163': '112314' '5164': '112315' '5165': '112316' '5166': '112317' '5167': '112318' '5168': '112468' '5169': '112481' '5170': '112483' '5171': '112484' '5172': '112485' '5173': '112486' '5174': '112487' '5175': '112488' '5176': '112490' '5177': '112526' '5178': '112527' '5179': '112528' '5180': '112529' '5181': '112583' '5182': '112584' '5183': '112585' '5184': '112586' '5185': '112587' '5186': '112588' '5187': '112668' '5188': '112733' '5189': '112734' '5190': '112735' '5191': '112767' '5192': '112768' '5193': '112769' '5194': '112770' '5195': '112780' '5196': '112781' '5197': '112785' '5198': '112788' '5199': '112789' '5200': '112790' '5201': '112821' '5202': '112975' '5203': '112976' '5204': '112977' '5205': '112978' '5206': '113016' '5207': '113017' '5208': '113018' '5209': '113019' '5210': '113020' '5211': '113021' '5212': '113022' '5213': '113023' '5214': '113024' '5215': '113025' '5216': '113026' '5217': '113027' '5218': 
'113028' '5219': '113030' '5220': '113031' '5221': '113032' '5222': '113033' '5223': '113034' '5224': '113035' '5225': '113036' '5226': '113037' '5227': '113063' '5228': '113110' '5229': '113164' '5230': '113165' '5231': '113166' '5232': '113167' '5233': '113203' '5234': '113259' '5235': '113260' '5236': '113261' '5237': '113262' '5238': '113263' '5239': '113264' '5240': '113265' '5241': '113266' '5242': '113267' '5243': '113268' '5244': '113269' '5245': '113270' '5246': '113271' '5247': '113272' '5248': '113273' '5249': '113274' '5250': '113275' '5251': '113276' '5252': '113277' '5253': '113278' '5254': '113279' '5255': '113280' '5256': '113281' '5257': '113282' '5258': '113284' '5259': '113294' '5260': '113303' '5261': '113304' '5262': '113305' '5263': '113311' '5264': '113334' '5265': '113335' '5266': '113336' '5267': '113342' '5268': '113343' '5269': '113344' '5270': '113357' '5271': '113359' '5272': '113360' '5273': '113453' '5274': '113511' '5275': '113512' '5276': '113513' '5277': '113530' '5278': '113558' '5279': '113564' '5280': '113574' '5281': '113696' '5282': '113697' '5283': '113698' '5284': '113699' '5285': '113700' '5286': '113701' '5287': '113702' '5288': '113787' '5289': '113788' '5290': '113789' '5291': '113790' '5292': '113808' '5293': '113809' '5294': '113810' '5295': '113822' '5296': '113932' '5297': '113933' '5298': '113934' '5299': '113935' '5300': '113946' '5301': '113949' '5302': '113950' '5303': '113969' '5304': '113970' '5305': '113971' '5306': '113972' '5307': '113973' '5308': '114006' '5309': '114007' '5310': '114036' '5311': '114037' '5312': '114040' '5313': '114041' '5314': '114042' '5315': '114044' '5316': '114045' '5317': '114047' '5318': '114048' '5319': '114049' '5320': '114050' '5321': '114051' '5322': '114061' '5323': '114062' '5324': '114063' '5325': '114064' '5326': '114065' '5327': '114066' '5328': '114067' '5329': '114069' '5330': '114070' '5331': '114072' '5332': '114073' '5333': '114074' '5334': '114076' '5335': '114077' 
'5336': '114198' '5337': '114199' '5338': '114200' '5339': '114201' '5340': '114212' '5341': '114222' '5342': '114223' '5343': '114231' '5344': '114232' '5345': '114233' '5346': '114234' '5347': '114235' '5348': '114236' '5349': '114237' '5350': '114238' '5351': '114239' '5352': '114242' '5353': '114245' '5354': '114265' '5355': '114266' '5356': '114268' '5357': '114272' '5358': '114274' '5359': '114275' '5360': '114279' '5361': '114282' '5362': '114283' '5363': '114289' '5364': '114290' '5365': '114291' '5366': '114292' '5367': '114293' '5368': '114294' '5369': '114295' '5370': '114296' '5371': '114297' '5372': '114298' '5373': '114371' '5374': '114372' '5375': '114373' '5376': '114374' '5377': '114375' '5378': '114384' '5379': '114385' '5380': '114386' '5381': '114387' '5382': '114388' '5383': '114389' '5384': '114390' '5385': '114391' '5386': '114392' '5387': '114393' '5388': '114395' '5389': '114396' '5390': '114397' '5391': '114398' '5392': '114399' '5393': '114400' '5394': '114401' '5395': '114402' '5396': '114403' '5397': '114404' '5398': '114405' '5399': '114406' '5400': '114408' '5401': '114409' '5402': '114410' '5403': '114411' '5404': '114412' '5405': '114413' '5406': '114414' '5407': '114415' '5408': '114416' '5409': '114430' '5410': '114532' '5411': '114533' '5412': '114534' '5413': '114535' '5414': '114536' '5415': '114538' '5416': '114539' '5417': '114541' '5418': '114544' '5419': '114545' '5420': '114556' '5421': '114558' '5422': '114559' '5423': '114879' '5424': '114880' '5425': '114884' '5426': '114936' '5427': '114937' '5428': '114938' '5429': '114939' '5430': '114940' '5431': '114941' '5432': '114942' '5433': '114943' '5434': '114974' '5435': '114976' '5436': '115002' '5437': '115011' '5438': '115125' '5439': '115176' '5440': '115262' '5441': '115263' '5442': '115267' '5443': '115268' '5444': '115269' '5445': '115271' '5446': '115272' '5447': '115273' '5448': '115288' '5449': '115289' '5450': '115290' '5451': '115292' '5452': '115293' '5453': 
'115294' '5454': '115321' '5455': '115339' '5456': '115391' '5457': '115392' '5458': '115470' '5459': '115471' '5460': '115472' '5461': '115473' '5462': '115474' '5463': '115475' '5464': '115591' '5465': '115592' '5466': '115597' '5467': '115697' '5468': '115698' '5469': '115699' '5470': '115700' '5471': '115721' '5472': '115722' '5473': '115723' '5474': '115724' '5475': '115735' '5476': '115761' '5477': '115762' '5478': '115764' '5479': '115765' '5480': '115766' '5481': '115767' '5482': '115768' '5483': '115769' '5484': '115771' '5485': '115772' '5486': '115773' '5487': '115774' '5488': '115775' '5489': '115811' '5490': '115812' '5491': '115813' '5492': '115814' '5493': '115815' '5494': '115816' '5495': '115817' '5496': '115849' '5497': '115850' '5498': '115852' '5499': '115888' '5500': '115891' '5501': '115892' '5502': '115922' '5503': '115923' '5504': '115925' '5505': '115926' '5506': '115927' '5507': '115930' '5508': '115932' '5509': '115935' '5510': '115944' '5511': '115948' '5512': '116029' '5513': '116068' '5514': '116098' '5515': '116099' '5516': '116101' '5517': '116116' '5518': '116119' '5519': '116175' '5520': '116176' '5521': '116177' '5522': '116235' '5523': '116236' '5524': '116237' '5525': '116238' '5526': '116239' '5527': '116240' '5528': '116241' '5529': '116242' '5530': '116243' '5531': '116261' '5532': '116344' '5533': '116345' '5534': '116372' '5535': '116383' '5536': '116388' '5537': '116389' '5538': '116390' '5539': '116407' '5540': '116446' '5541': '116447' '5542': '116448' '5543': '116449' '5544': '116451' '5545': '116452' '5546': '116453' '5547': '116454' '5548': '116455' '5549': '116456' '5550': '116457' '5551': '116458' '5552': '116464' '5553': '116465' '5554': '116466' '5555': '116467' '5556': '116468' '5557': '116487' '5558': '116488' '5559': '116489' '5560': '116490' '5561': '116491' '5562': '116514' '5563': '116517' '5564': '116525' '5565': '116526' '5566': '116527' '5567': '116528' '5568': '116547' '5569': '116549' '5570': '116586' 
'5571': '116587' '5572': '116704' '5573': '116706' '5574': '116707' '5575': '116709' '5576': '116733' '5577': '116735' '5578': '116736' '5579': '116753' '5580': '116755' '5581': '116756' '5582': '116757' '5583': '116758' '5584': '116759' '5585': '116760' '5586': '116833' '5587': '116868' '5588': '116869' '5589': '116870' '5590': '116871' '5591': '116872' '5592': '116873' '5593': '116874' '5594': '116876' '5595': '116877' '5596': '116878' '5597': '116879' '5598': '116880' '5599': '116881' '5600': '116882' '5601': '116883' '5602': '117057' '5603': '117159' '5604': '117160' '5605': '117161' '5606': '117169' '5607': '117170' '5608': '117171' '5609': '117172' '5610': '117173' '5611': '117251' '5612': '117252' '5613': '117253' '5614': '117287' '5615': '117288' '5616': '117450' '5617': '117472' '5618': '117473' '5619': '117609' '5620': '117610' '5621': '117611' '5622': '117612' '5623': '117613' '5624': '117614' '5625': '117626' '5626': '117627' '5627': '117628' '5628': '117629' '5629': '117630' '5630': '117631' '5631': '117632' '5632': '117666' '5633': '117667' '5634': '117668' '5635': '117669' '5636': '117670' '5637': '117846' '5638': '117883' '5639': '117884' '5640': '117885' '5641': '117886' '5642': '117887' '5643': '117942' '5644': '117943' '5645': '117944' '5646': '117945' '5647': '117946' '5648': '117961' '5649': '117966' '5650': '117967' '5651': '117970' '5652': '117991' '5653': '118000' '5654': '118012' '5655': '118058' '5656': '118059' '5657': '118060' '5658': '118061' '5659': '118062' '5660': '118063' '5661': '118068' '5662': '118070' '5663': '118084' '5664': '118085' '5665': '118087' '5666': '118195' '5667': '118196' '5668': '118222' '5669': '118223' '5670': '118257' '5671': '118276' '5672': '118277' '5673': '118279' '5674': '118327' '5675': '118384' '5676': '118478' '5677': '118484' '5678': '118489' '5679': '118496' '5680': '118498' '5681': '118499' '5682': '118500' '5683': '118502' '5684': '118503' '5685': '118504' '5686': '118505' '5687': '118507' '5688': 
'118569' '5689': '118618' '5690': '118629' '5691': '118670' '5692': '118671' '5693': '118672' '5694': '118674' '5695': '118734' '5696': '118735' '5697': '118738' '5698': '118739' '5699': '118886' '5700': '118891' '5701': '118920' '5702': '118921' '5703': '118922' '5704': '118923' '5705': '118950' '5706': '118951' '5707': '118952' '5708': '118953' '5709': '118954' '5710': '118955' '5711': '118957' '5712': '118958' '5713': '118972' '5714': '118986' '5715': '118987' '5716': '118988' '5717': '119025' '5718': '119026' '5719': '119027' '5720': '119063' '5721': '119086' '5722': '119095' '5723': '119097' '5724': '119118' '5725': '119134' '5726': '119187' '5727': '119193' '5728': '119257' '5729': '119369' '5730': '119379' '5731': '119413' '5732': '119545' '5733': '119569' '5734': '119571' '5735': '119574' '5736': '119575' '5737': '119578' '5738': '119579' '5739': '119580' '5740': '119582' '5741': '119583' '5742': '119584' '5743': '119592' '5744': '119715' '5745': '119719' '5746': '119725' '5747': '119726' '5748': '119727' '5749': '119745' '5750': '119828' '5751': '119830' '5752': '119831' '5753': '119893' '5754': '119894' '5755': '119895' '5756': '119896' '5757': '119897' '5758': '119898' '5759': '119899' '5760': '119900' '5761': '119901' '5762': '119922' '5763': '119938' '5764': '119939' '5765': '119940' '5766': '119941' '5767': '119942' '5768': '119979' '5769': '119985' '5770': '119988' '5771': '119991' '5772': '119992' '5773': '119993' '5774': '119994' '5775': '120099' '5776': '120105' '5777': '120109' '5778': '120111' '5779': '120112' '5780': '120150' '5781': '120160' '5782': '120161' '5783': '120171' '5784': '120172' '5785': '120177' '5786': '120178' '5787': '120179' '5788': '120183' '5789': '120184' '5790': '120188' '5791': '120189' '5792': '120194' '5793': '120196' '5794': '120199' '5795': '120200' '5796': '120201' '5797': '120203' '5798': '120206' '5799': '120207' '5800': '120208' '5801': '120296' '5802': '120297' '5803': '120298' '5804': '120299' '5805': '120300' 
'5806': '120302' '5807': '120303' '5808': '120304' '5809': '120305' '5810': '120306' '5811': '120307' '5812': '120308' '5813': '120309' '5814': '120310' '5815': '120312' '5816': '120313' '5817': '120314' '5818': '120315' '5819': '120316' '5820': '120317' '5821': '120318' '5822': '120319' '5823': '120320' '5824': '120321' '5825': '120322' '5826': '120323' '5827': '120324' '5828': '120325' '5829': '120326' '5830': '120327' '5831': '120328' '5832': '120329' '5833': '120330' '5834': '120331' '5835': '120332' '5836': '120333' '5837': '120462' '5838': '120466' '5839': '120467' '5840': '120468' '5841': '120469' '5842': '120470' '5843': '120471' '5844': '120504' '5845': '120513' '5846': '120514' '5847': '120515' '5848': '120518' '5849': '120769' '5850': '120770' '5851': '120771' '5852': '120772' '5853': '120773' '5854': '120774' '5855': '120775' '5856': '120776' '5857': '120777' '5858': '120778' '5859': '120779' '5860': '120782' '5861': '121251' '5862': '121256' '5863': '121257' '5864': '121273' '5865': '121288' '5866': '121312' '5867': '121313' '5868': '121314' '5869': '121315' '5870': '121316' '5871': '121317' '5872': '121318' '5873': '121319' '5874': '121320' '5875': '121321' '5876': '121322' '5877': '121323' '5878': '121346' '5879': '121366' '5880': '121415' '5881': '121449' '5882': '121450' '5883': '121451' '5884': '121452' '5885': '121453' '5886': '121454' '5887': '121472' '5888': '121473' '5889': '121474' '5890': '121475' '5891': '121570' '5892': '121589' '5893': '121590' '5894': '121591' '5895': '121592' '5896': '121593' '5897': '121594' '5898': '121595' '5899': '121651' '5900': '121652' '5901': '121653' '5902': '121654' '5903': '121655' '5904': '121656' '5905': '121657' '5906': '121658' '5907': '121659' '5908': '121660' '5909': '121661' '5910': '121662' '5911': '121663' '5912': '121664' '5913': '121665' '5914': '121666' '5915': '121734' '5916': '121735' '5917': '121736' '5918': '121737' '5919': '121738' '5920': '121739' '5921': '121740' '5922': '121813' '5923': 
'121866' '5924': '121867' '5925': '121869' '5926': '121913' '5927': '121915' '5928': '121922' '5929': '121926' '5930': '121929' '5931': '121930' '5932': '121976' '5933': '121985' '5934': '121987' '5935': '121998' '5936': '122001' '5937': '122003' '5938': '122004' '5939': '122066' '5940': '122077' '5941': '122079' '5942': '122080' '5943': '122081' '5944': '122082' '5945': '122083' '5946': '122084' '5947': '122085' '5948': '122086' '5949': '122087' '5950': '122088' '5951': '122106' '5952': '122107' '5953': '122132' '5954': '122143' '5955': '122153' '5956': '122155' '5957': '122166' '5958': '122168' '5959': '122190' '5960': '122199' '5961': '122201' '5962': '122204' '5963': '122247' '5964': '122261' '5965': '122352' '5966': '122353' '5967': '122354' '5968': '122355' '5969': '122356' '5970': '122357' '5971': '122358' '5972': '122359' '5973': '122360' '5974': '122362' '5975': '122363' '5976': '122364' '5977': '122365' '5978': '122395' '5979': '122397' '5980': '122398' '5981': '122399' '5982': '122400' '5983': '122456' '5984': '122457' '5985': '122472' '5986': '122473' '5987': '122474' '5988': '122475' '5989': '122498' '5990': '122499' '5991': '122500' '5992': '122503' '5993': '122504' '5994': '122510' '5995': '122511' '5996': '122533' '5997': '122534' '5998': '122578' '5999': '122579' '6000': '122620' '6001': '122621' '6002': '122622' '6003': '122623' '6004': '122624' '6005': '122625' '6006': '122626' '6007': '122627' '6008': '122628' '6009': '122630' '6010': '122631' '6011': '122632' '6012': '122633' '6013': '122634' '6014': '122635' '6015': '122644' '6016': '122645' '6017': '122646' '6018': '122647' '6019': '122648' '6020': '122649' '6021': '122650' '6022': '122651' '6023': '122654' '6024': '122671' '6025': '122673' '6026': '122675' '6027': '122683' '6028': '122685' '6029': '122686' '6030': '122798' '6031': '122799' '6032': '122800' '6033': '122803' '6034': '122804' '6035': '122805' '6036': '122806' '6037': '122807' '6038': '122808' '6039': '122809' '6040': '122810' 
'6041': '122832' '6042': '122901' '6043': '122910' '6044': '122911' '6045': '122932' '6046': '122934' '6047': '122935' '6048': '122936' '6049': '122959' '6050': '122999' '6051': '123000' '6052': '123001' '6053': '123002' '6054': '123003' '6055': '123004' '6056': '123094' '6057': '123096' '6058': '123097' '6059': '123099' '6060': '123147' '6061': '123273' '6062': '123278' '6063': '123333' '6064': '123342' '6065': '123427' '6066': '123438' '6067': '123439' '6068': '123440' '6069': '123441' '6070': '123442' '6071': '123458' '6072': '123461' '6073': '123467' '6074': '123468' '6075': '123474' '6076': '123484' '6077': '123485' '6078': '123486' '6079': '123487' '6080': '123488' '6081': '123490' '6082': '123494' '6083': '123501' '6084': '123502' '6085': '123503' '6086': '123504' '6087': '123505' '6088': '123506' '6089': '123509' '6090': '123523' '6091': '123614' '6092': '123641' '6093': '123645' '6094': '123647' '6095': '123760' '6096': '123761' '6097': '123762' '6098': '123763' '6099': '123764' '6100': '123821' '6101': '123825' '6102': '123832' '6103': '123834' '6104': '123835' '6105': '123866' '6106': '123867' '6107': '123868' '6108': '123899' '6109': '123932' '6110': '123933' '6111': '123934' '6112': '123935' '6113': '123936' '6114': '123937' '6115': '123938' '6116': '123964' '6117': '123965' '6118': '123966' '6119': '123968' '6120': '123969' '6121': '123970' '6122': '123971' '6123': '123972' '6124': '123973' '6125': '123974' '6126': '123975' '6127': '123976' '6128': '123977' '6129': '123978' '6130': '123979' '6131': '123980' '6132': '123981' '6133': '123986' '6134': '124154' '6135': '124175' '6136': '124176' '6137': '124177' '6138': '124178' '6139': '124179' '6140': '124180' '6141': '124181' '6142': '124183' '6143': '124184' '6144': '124185' '6145': '124186' '6146': '124201' '6147': '124231' '6148': '124391' '6149': '124392' '6150': '124393' '6151': '124394' '6152': '124409' '6153': '124411' '6154': '124424' '6155': '124425' '6156': '124426' '6157': '124460' '6158': 
'124461' '6159': '124470' '6160': '124474' '6161': '124477' '6162': '124479' '6163': '124480' '6164': '124481' '6165': '124482' '6166': '124483' '6167': '124484' '6168': '124485' '6169': '124509' '6170': '124517' '6171': '124518' '6172': '124519' '6173': '124554' '6174': '124555' '6175': '124702' '6176': '124752' '6177': '124753' '6178': '124754' '6179': '124755' '6180': '124756' '6181': '124870' '6182': '124872' '6183': '124873' '6184': '124874' '6185': '124875' '6186': '124876' '6187': '124877' '6188': '124891' '6189': '124892' '6190': '124912' '6191': '124913' '6192': '124915' '6193': '124916' '6194': '124917' '6195': '124918' '6196': '124971' '6197': '124992' '6198': '124996' '6199': '125001' '6200': '125002' '6201': '125003' '6202': '125004' '6203': '125154' '6204': '125156' '6205': '125157' '6206': '125158' '6207': '125159' '6208': '125160' '6209': '125161' '6210': '125182' '6211': '125183' '6212': '125185' '6213': '125186' '6214': '125187' '6215': '125188' '6216': '125189' '6217': '125190' '6218': '125191' '6219': '125192' '6220': '125193' '6221': '125194' '6222': '125195' '6223': '125196' '6224': '125237' '6225': '125238' '6226': '125239' '6227': '125240' '6228': '125286' '6229': '125287' '6230': '125288' '6231': '125289' '6232': '125291' '6233': '125293' '6234': '125298' '6235': '125299' '6236': '125312' '6237': '125313' '6238': '125314' '6239': '125315' '6240': '125333' '6241': '125337' '6242': '125375' '6243': '125377' '6244': '125432' '6245': '125551' '6246': '125612' '6247': '125614' '6248': '125616' '6249': '125617' '6250': '125618' '6251': '125620' '6252': '125621' '6253': '125622' '6254': '125657' '6255': '125659' '6256': '125680' '6257': '125681' '6258': '125721' '6259': '125722' '6260': '125723' '6261': '125774' '6262': '125776' '6263': '125777' '6264': '125778' '6265': '125779' '6266': '125809' '6267': '125812' '6268': '125813' '6269': '125814' '6270': '125815' '6271': '125816' '6272': '125817' '6273': '125818' '6274': '125819' '6275': '125820' 
'6276': '125821' '6277': '125822' '6278': '125823' '6279': '125824' '6280': '125825' '6281': '125826' '6282': '125827' '6283': '125999' '6284': '126014' '6285': '126015' '6286': '126016' '6287': '126017' '6288': '126018' '6289': '126047' '6290': '126055' '6291': '126102' '6292': '126103' '6293': '126104' '6294': '126105' '6295': '126180' '6296': '126181' '6297': '126182' '6298': '126183' '6299': '126185' '6300': '126186' '6301': '126187' '6302': '126188' '6303': '126189' '6304': '126214' '6305': '126215' '6306': '126216' '6307': '126217' '6308': '126218' '6309': '126219' '6310': '126220' '6311': '126221' '6312': '126223' '6313': '126224' '6314': '126225' '6315': '126226' '6316': '126227' '6317': '126229' '6318': '126230' '6319': '126231' '6320': '126232' '6321': '126233' '6322': '126234' '6323': '126240' '6324': '126241' '6325': '126242' '6326': '126243' '6327': '126276' '6328': '126283' '6329': '126289' '6330': '126290' '6331': '126291' '6332': '126292' '6333': '126294' '6334': '126295' '6335': '126297' '6336': '126300' '6337': '126316' '6338': '126317' '6339': '126318' '6340': '126319' '6341': '126320' '6342': '126321' '6343': '126354' '6344': '126357' '6345': '126362' '6346': '126398' '6347': '126400' '6348': '126401' '6349': '126402' '6350': '126403' '6351': '126404' '6352': '126405' '6353': '126406' '6354': '126407' '6355': '126408' '6356': '126409' '6357': '126410' '6358': '126411' '6359': '126412' '6360': '126413' '6361': '126414' '6362': '126415' '6363': '126416' '6364': '126417' '6365': '126425' '6366': '126426' '6367': '126427' '6368': '126428' '6369': '126429' '6370': '126430' '6371': '126431' '6372': '126455' '6373': '126489' '6374': '126490' '6375': '126491' '6376': '126505' '6377': '126506' '6378': '126507' '6379': '126508' '6380': '126510' '6381': '126512' '6382': '126516' '6383': '126519' '6384': '126520' '6385': '126521' '6386': '126522' '6387': '126550' '6388': '126557' '6389': '126559' '6390': '126584' '6391': '126585' '6392': '126586' '6393': 
'126587' '6394': '126588' '6395': '126589' '6396': '126598' '6397': '126600' '6398': '126601' '6399': '126602' '6400': '126603' '6401': '126605' '6402': '126606' '6403': '126607' '6404': '126608' '6405': '126646' '6406': '126666' '6407': '126667' '6408': '126668' '6409': '126669' '6410': '126670' '6411': '126671' '6412': '126672' '6413': '126673' '6414': '126674' '6415': '126675' '6416': '126676' '6417': '126716' '6418': '126717' '6419': '126718' '6420': '126719' '6421': '126720' '6422': '126743' '6423': '126746' '6424': '126747' '6425': '126748' '6426': '126749' '6427': '126773' '6428': '126778' '6429': '126781' '6430': '126782' '6431': '126786' '6432': '126789' '6433': '126790' '6434': '126882' '6435': '126883' '6436': '126884' '6437': '126885' '6438': '126886' '6439': '126887' '6440': '126899' '6441': '126900' '6442': '126944' '6443': '126979' '6444': '127036' '6445': '127037' '6446': '127062' '6447': '127066' '6448': '127155' '6449': '127159' '6450': '127180' '6451': '127181' '6452': '127182' '6453': '127183' '6454': '127184' '6455': '127185' '6456': '127186' '6457': '127187' '6458': '127188' '6459': '127189' '6460': '127190' '6461': '127191' '6462': '127192' '6463': '127193' '6464': '127194' '6465': '127203' '6466': '127204' '6467': '127205' '6468': '127206' '6469': '127207' '6470': '127208' '6471': '127209' '6472': '127210' '6473': '127211' '6474': '127212' '6475': '127263' '6476': '127265' '6477': '127266' '6478': '127267' '6479': '127268' '6480': '127269' '6481': '127271' '6482': '127273' '6483': '127274' '6484': '127275' '6485': '127276' '6486': '127277' '6487': '127278' '6488': '127279' '6489': '127280' '6490': '127281' '6491': '127285' '6492': '127286' '6493': '127287' '6494': '127288' '6495': '127289' '6496': '127290' '6497': '127294' '6498': '127295' '6499': '127296' '6500': '127297' '6501': '127298' '6502': '127299' '6503': '127300' '6504': '127301' '6505': '127302' '6506': '127303' '6507': '127330' '6508': '127331' '6509': '127339' '6510': '127343' 
'6511': '127349' '6512': '127350' '6513': '127356' '6514': '127357' '6515': '127358' '6516': '127359' '6517': '127360' '6518': '127402' '6519': '127422' '6520': '127469' '6521': '127484' '6522': '127494' '6523': '127495' '6524': '127496' '6525': '127497' '6526': '127498' '6527': '127499' '6528': '127519' '6529': '127520' '6530': '127532' '6531': '127541' '6532': '127542' '6533': '127559' '6534': '127620' '6535': '127623' '6536': '127648' '6537': '127660' '6538': '127661' '6539': '127662' '6540': '127663' '6541': '127720' '6542': '127722' '6543': '127726' '6544': '127798' '6545': '127804' '6546': '127806' '6547': '127865' '6548': '127866' '6549': '127867' '6550': '127868' '6551': '127869' '6552': '127870' '6553': '127871' '6554': '127878' '6555': '127908' '6556': '127909' '6557': '127910' '6558': '127911' '6559': '127912' '6560': '127913' '6561': '127914' '6562': '127915' '6563': '127916' '6564': '127936' '6565': '127996' '6566': '128441' '6567': '128443' '6568': '128448' '6569': '128469' '6570': '128470' '6571': '128471' '6572': '128472' '6573': '128473' '6574': '128476' '6575': '128477' '6576': '128482' '6577': '128484' '6578': '128494' '6579': '128500' '6580': '128504' '6581': '128619' '6582': '128666' '6583': '128668' '6584': '128699' '6585': '128709' '6586': '128710' '6587': '128711' '6588': '128758' '6589': '128759' '6590': '128760' '6591': '128799' '6592': '128811' '6593': '128812' '6594': '128813' '6595': '128814' '6596': '128815' '6597': '128816' '6598': '128825' '6599': '128827' '6600': '128828' '6601': '128835' '6602': '128845' '6603': '128878' '6604': '128879' '6605': '128880' '6606': '128881' '6607': '128882' '6608': '128885' '6609': '128886' '6610': '128887' '6611': '128888' '6612': '128927' '6613': '128992' '6614': '129039' '6615': '129040' '6616': '129042' '6617': '129043' '6618': '129044' '6619': '129046' '6620': '129048' '6621': '129049' '6622': '129051' '6623': '129052' '6624': '129053' '6625': '129054' '6626': '129055' '6627': '129056' '6628': 
'129088' '6629': '129089' '6630': '129090' '6631': '129091' '6632': '129092' '6633': '129093' '6634': '129094' '6635': '129095' '6636': '129096' '6637': '129097' '6638': '129098' '6639': '129184' '6640': '129185' '6641': '129186' '6642': '129187' '6643': '129188' '6644': '129189' '6645': '129190' '6646': '129268' '6647': '129362' '6648': '129372' '6649': '129374' '6650': '129375' '6651': '129391' '6652': '129392' '6653': '129393' '6654': '129395' '6655': '129396' '6656': '129397' '6657': '129398' '6658': '129399' '6659': '129400' '6660': '129401' '6661': '129402' '6662': '129403' '6663': '129404' '6664': '129405' '6665': '129406' '6666': '129407' '6667': '129439' '6668': '129442' '6669': '129444' '6670': '129620' '6671': '129622' '6672': '129624' '6673': '129674' '6674': '129675' '6675': '129683' '6676': '129694' '6677': '129695' '6678': '129696' '6679': '129742' '6680': '129806' '6681': '129807' '6682': '129808' '6683': '129816' '6684': '129874' '6685': '129875' '6686': '129876' '6687': '129879' '6688': '129880' '6689': '129882' '6690': '129883' '6691': '129884' '6692': '129885' '6693': '129886' '6694': '129887' '6695': '129889' '6696': '129904' '6697': '129910' '6698': '129914' '6699': '129915' '6700': '129918' '6701': '129919' '6702': '129920' '6703': '129922' '6704': '129923' '6705': '129924' '6706': '129925' '6707': '129926' '6708': '129927' '6709': '129962' '6710': '129968' '6711': '129969' '6712': '129970' '6713': '129972' '6714': '129973' '6715': '129997' '6716': '130016' '6717': '130084' '6718': '130129' '6719': '130130' '6720': '130131' '6721': '130132' '6722': '130133' '6723': '130134' '6724': '130135' '6725': '130136' '6726': '130137' '6727': '130168' '6728': '130170' '6729': '130218' '6730': '130265' '6731': '130347' '6732': '130349' '6733': '130367' '6734': '130368' '6735': '130369' '6736': '130370' '6737': '130371' '6738': '130372' '6739': '130440' '6740': '130454' '6741': '130456' '6742': '130650' '6743': '130667' '6744': '130682' '6745': '130683' 
'6746': '130689' '6747': '130691' '6748': '130692' '6749': '130693' '6750': '130702' '6751': '130709' '6752': '130710' '6753': '130711' '6754': '130752' '6755': '130758' '6756': '130920' '6757': '130921' '6758': '130922' '6759': '130923' '6760': '130927' '6761': '130929' '6762': '130930' '6763': '130931' '6764': '130932' '6765': '130933' '6766': '130934' '6767': '130937' '6768': '130940' '6769': '130944' '6770': '130945' '6771': '130948' '6772': '130950' '6773': '130951' '6774': '130952' '6775': '130953' '6776': '130954' '6777': '130955' '6778': '130956' '6779': '130963' '6780': '130964' '6781': '130986' '6782': '130988' '6783': '130989' '6784': '130990' '6785': '130991' '6786': '130992' '6787': '130993' '6788': '131016' '6789': '131019' '6790': '131020' '6791': '131021' '6792': '131024' '6793': '131166' '6794': '131292' '6795': '131323' '6796': '131324' '6797': '131325' '6798': '131326' '6799': '131327' '6800': '131385' '6801': '131410' '6802': '131422' '6803': '131425' '6804': '131426' '6805': '131436' '6806': '131439' '6807': '131444' '6808': '131446' '6809': '131448' '6810': '131449' '6811': '131451' '6812': '131452' '6813': '131453' '6814': '131454' '6815': '131476' '6816': '131536' '6817': '131540' '6818': '131552' '6819': '131553' '6820': '131554' '6821': '131567' '6822': '131624' '6823': '131656' '6824': '131657' '6825': '131658' '6826': '131764' '6827': '131767' '6828': '131770' '6829': '131771' '6830': '131772' '6831': '131773' '6832': '131774' '6833': '131787' '6834': '131789' '6835': '131791' '6836': '131792' '6837': '131794' '6838': '131795' '6839': '131796' '6840': '131797' '6841': '131837' '6842': '131897' '6843': '131899' '6844': '131900' '6845': '131901' '6846': '131902' '6847': '131903' '6848': '131904' '6849': '131911' '6850': '131912' '6851': '131913' '6852': '131914' '6853': '131917' '6854': '131918' '6855': '131919' '6856': '131922' '6857': '131923' '6858': '131924' '6859': '131925' '6860': '131932' '6861': '131933' '6862': '131934' '6863': 
'131935' '6864': '131936' '6865': '131938' '6866': '131939' '6867': '131940' '6868': '131941' '6869': '131942' '6870': '131950' '6871': '131951' '6872': '131952' '6873': '131953' '6874': '131978' '6875': '131979' '6876': '131980' '6877': '131982' '6878': '131983' '6879': '131984' '6880': '131985' '6881': '131986' '6882': '132019' '6883': '132040' '6884': '132041' '6885': '132042' '6886': '132045' '6887': '132117' '6888': '132118' '6889': '132122' '6890': '132134' '6891': '132138' '6892': '132139' '6893': '132140' '6894': '132141' '6895': '132142' '6896': '132171' '6897': '132272' '6898': '132310' '6899': '132420' '6900': '132424' '6901': '132434' '6902': '132436' '6903': '132448' '6904': '132449' '6905': '132453' '6906': '132454' '6907': '132455' '6908': '132456' '6909': '132561' '6910': '132566' '6911': '132567' '6912': '132568' '6913': '132589' '6914': '132675' '6915': '132677' '6916': '132678' '6917': '132679' '6918': '132773' '6919': '132774' '6920': '132775' '6921': '132778' '6922': '132779' '6923': '132781' '6924': '132784' '6925': '132786' '6926': '132787' '6927': '132788' '6928': '132789' '6929': '132790' '6930': '132791' '6931': '132792' '6932': '132793' '6933': '132794' '6934': '132795' '6935': '132914' '6936': '132954' '6937': '132961' '6938': '132962' '6939': '132963' '6940': '132964' '6941': '132965' '6942': '133015' '6943': '133016' '6944': '133019' '6945': '133020' '6946': '133022' '6947': '133023' '6948': '133024' '6949': '133025' '6950': '133026' '6951': '133027' '6952': '133028' '6953': '133029' '6954': '133100' '6955': '133102' '6956': '133272' '6957': '133273' '6958': '133274' '6959': '133275' '6960': '133276' '6961': '133293' '6962': '133294' '6963': '133332' '6964': '133333' '6965': '133431' '6966': '133432' '6967': '133433' '6968': '133434' '6969': '133435' '6970': '133436' '6971': '133437' '6972': '133438' '6973': '133439' '6974': '133440' '6975': '133441' '6976': '133442' '6977': '133443' '6978': '133444' '6979': '133445' '6980': '133446' 
'6981': '133447' '6982': '133448' '6983': '133449' '6984': '133450' '6985': '133451' '6986': '133452' '6987': '133453' '6988': '133454' '6989': '133455' '6990': '133456' '6991': '133457' '6992': '133459' '6993': '133479' '6994': '133535' '6995': '133537' '6996': '133538' '6997': '133544' '6998': '133545' '6999': '133546' '7000': '133551' '7001': '133553' '7002': '133560' '7003': '133561' '7004': '133562' '7005': '133563' '7006': '133564' '7007': '133567' '7008': '133571' '7009': '133572' '7010': '133573' '7011': '133574' '7012': '133576' '7013': '133579' '7014': '133580' '7015': '133632' '7016': '133638' '7017': '133639' '7018': '133681' '7019': '133729' '7020': '133731' '7021': '133770' '7022': '133772' '7023': '133780' '7024': '133781' '7025': '133788' '7026': '133793' '7027': '133798' '7028': '133802' '7029': '133803' '7030': '133833' '7031': '133835' '7032': '133836' '7033': '133837' '7034': '133838' '7035': '133916' '7036': '133942' '7037': '133943' '7038': '133967' '7039': '133968' '7040': '133969' '7041': '133970' '7042': '133971' '7043': '133972' '7044': '133973' '7045': '133974' '7046': '133975' '7047': '133976' '7048': '133977' '7049': '133978' '7050': '134034' '7051': '134052' '7052': '134053' '7053': '134054' '7054': '134073' '7055': '134077' '7056': '134084' '7057': '134094' '7058': '134359' '7059': '134384' '7060': '134385' '7061': '134388' '7062': '134389' '7063': '134443' '7064': '134444' '7065': '134445' '7066': '134446' '7067': '134447' '7068': '134448' '7069': '134449' '7070': '134452' '7071': '134453' '7072': '134454' '7073': '134455' '7074': '134486' '7075': '134509' '7076': '134510' '7077': '134580' '7078': '134586' '7079': '134594' '7080': '134610' '7081': '134631' '7082': '134643' '7083': '134790' '7084': '134791' '7085': '134792' '7086': '134793' '7087': '134794' '7088': '134795' '7089': '134796' '7090': '134797' '7091': '134801' '7092': '134823' '7093': '134824' '7094': '134825' '7095': '134826' '7096': '134827' '7097': '134918' '7098': 
'134919' '7099': '134922' '7100': '134923' '7101': '134928' '7102': '134929' '7103': '134930' '7104': '134931' '7105': '134932' '7106': '134933' '7107': '134934' '7108': '134935' '7109': '134936' '7110': '134937' '7111': '134938' '7112': '134939' '7113': '134940' '7114': '134941' '7115': '134942' '7116': '134943' '7117': '134947' '7118': '134948' '7119': '134949' '7120': '134950' '7121': '134951' '7122': '134952' '7123': '134956' '7124': '134959' '7125': '134962' '7126': '134979' '7127': '134981' '7128': '135010' '7129': '135028' '7130': '135039' '7131': '135043' '7132': '135044' '7133': '135054' '7134': '135089' '7135': '135091' '7136': '135092' '7137': '135219' '7138': '135220' '7139': '135221' '7140': '135222' '7141': '135223' '7142': '135224' '7143': '135225' '7144': '135226' '7145': '135227' '7146': '135228' '7147': '135229' '7148': '135336' '7149': '135337' '7150': '135338' '7151': '135339' '7152': '135340' '7153': '135341' '7154': '135342' '7155': '135363' '7156': '135364' '7157': '135365' '7158': '135368' '7159': '135369' '7160': '135370' '7161': '135371' '7162': '135372' '7163': '135373' '7164': '135374' '7165': '135375' '7166': '135986' '7167': '135989' '7168': '135990' '7169': '136054' '7170': '136091' '7171': '136094' '7172': '136134' '7173': '136137' '7174': '136138' '7175': '136275' '7176': '136276' '7177': '136320' '7178': '136321' '7179': '136322' '7180': '136323' '7181': '136324' '7182': '136331' '7183': '136404' '7184': '136424' '7185': '136449' '7186': '136465' '7187': '136466' '7188': '136467' '7189': '136468' '7190': '136469' '7191': '136705' '7192': '136706' '7193': '136707' '7194': '136708' '7195': '136709' '7196': '136928' '7197': '136994' '7198': '136995' '7199': '137054' '7200': '137151' '7201': '137152' '7202': '137166' '7203': '137167' '7204': '137168' '7205': '137169' '7206': '137170' '7207': '137171' '7208': '137172' '7209': '137173' '7210': '137174' '7211': '137175' '7212': '137176' '7213': '137211' '7214': '137212' '7215': '137213' 
'7216': '137214' '7217': '137356' '7218': '137417' '7219': '137418' '7220': '137419' '7221': '137423' '7222': '137424' '7223': '137425' '7224': '137426' '7225': '137462' '7226': '137463' '7227': '137484' '7228': '137500' '7229': '137551' '7230': '137561' '7231': '137563' '7232': '137567' '7233': '137593' '7234': '137605' '7235': '137624' '7236': '137627' '7237': '137630' '7238': '137631' '7239': '137632' '7240': '137715' '7241': '137716' '7242': '137717' '7243': '137719' '7244': '137720' '7245': '137721' '7246': '137722' '7247': '137723' '7248': '137724' '7249': '137725' '7250': '137740' '7251': '137895' '7252': '137896' '7253': '137898' '7254': '137899' '7255': '137900' '7256': '137901' '7257': '137907' '7258': '137935' '7259': '137990' '7260': '137998' '7261': '138010' '7262': '138015' '7263': '138016' '7264': '138017' '7265': '138018' '7266': '138019' '7267': '138020' '7268': '138021' '7269': '138022' '7270': '138023' '7271': '138024' '7272': '138025' '7273': '138026' '7274': '138038' '7275': '138039' '7276': '138040' '7277': '138041' '7278': '138053' '7279': '138060' '7280': '138061' '7281': '138062' '7282': '138063' '7283': '138064' '7284': '138065' '7285': '138066' '7286': '138067' '7287': '138068' '7288': '138069' '7289': '138070' '7290': '138071' '7291': '138207' '7292': '138210' '7293': '138211' '7294': '138212' '7295': '138213' '7296': '138215' '7297': '138216' '7298': '138217' '7299': '138218' '7300': '138256' '7301': '138282' '7302': '138306' '7303': '138311' '7304': '138317' '7305': '138318' '7306': '138319' '7307': '138320' '7308': '138351' '7309': '138355' '7310': '138406' '7311': '138410' '7312': '138413' '7313': '138414' '7314': '138415' '7315': '138416' '7316': '138578' '7317': '138579' '7318': '138580' '7319': '138581' '7320': '139003' '7321': '139043' '7322': '139110' '7323': '139112' '7324': '139117' '7325': '139123' '7326': '139226' '7327': '139329' '7328': '139330' '7329': '139461' '7330': '139485' '7331': '139491' '7332': '139520' '7333': 
'139521' '7334': '139522' '7335': '139523' '7336': '139524' '7337': '139532' '7338': '139534' '7339': '139536' '7340': '139537' '7341': '139637' '7342': '139638' '7343': '139663' '7344': '139681' '7345': '139687' '7346': '139688' '7347': '139769' '7348': '139770' '7349': '139771' '7350': '139772' '7351': '139773' '7352': '139774' '7353': '139775' '7354': '139776' '7355': '139777' '7356': '139804' '7357': '139862' '7358': '139876' '7359': '139933' '7360': '139934' '7361': '139935' '7362': '139936' '7363': '139937' '7364': '139954' '7365': '139990' '7366': '139991' '7367': '139992' '7368': '139993' '7369': '139994' '7370': '139995' '7371': '140043' '7372': '140134' '7373': '140135' '7374': '140258' '7375': '140259' '7376': '140260' '7377': '140261' '7378': '140262' '7379': '140263' '7380': '140266' '7381': '140316' '7382': '140344' '7383': '140421' '7384': '140564' '7385': '140565' '7386': '140566' '7387': '140576' '7388': '140583' '7389': '140584' '7390': '140609' '7391': '140620' '7392': '140621' '7393': '140623' '7394': '140625' '7395': '140626' '7396': '140788' '7397': '140789' '7398': '140790' '7399': '140791' '7400': '140794' '7401': '140871' '7402': '140872' '7403': '140873' '7404': '140874' '7405': '140875' '7406': '140922' '7407': '140923' '7408': '140924' '7409': '140925' '7410': '140926' '7411': '140933' '7412': '140934' '7413': '140935' '7414': '140939' '7415': '141074' '7416': '141137' '7417': '141139' '7418': '141141' '7419': '141143' '7420': '141144' '7421': '141164' '7422': '141166' '7423': '141167' '7424': '141168' '7425': '141173' '7426': '141179' '7427': '141180' '7428': '141181' '7429': '141182' '7430': '141264' '7431': '141282' '7432': '141283' '7433': '141284' '7434': '141285' '7435': '141286' '7436': '141287' '7437': '141288' '7438': '141289' '7439': '141290' '7440': '141291' '7441': '141292' '7442': '141293' '7443': '141295' '7444': '141296' '7445': '141297' '7446': '141299' '7447': '141300' '7448': '141303' '7449': '141304' '7450': '141310' 
'7451': '141375' '7452': '141561' '7453': '141562' '7454': '141564' '7455': '141566' '7456': '141567' '7457': '141568' '7458': '141569' '7459': '141590' '7460': '141591' '7461': '141592' '7462': '141593' '7463': '141594' '7464': '141616' '7465': '141617' '7466': '141618' '7467': '141619' '7468': '141735' '7469': '141873' '7470': '141874' '7471': '141875' '7472': '141876' '7473': '141877' '7474': '141878' '7475': '141894' '7476': '141901' '7477': '141902' '7478': '141903' '7479': '141972' '7480': '142078' '7481': '142079' '7482': '142080' '7483': '142081' '7484': '142082' '7485': '142083' '7486': '142084' '7487': '142085' '7488': '142086' '7489': '142087' '7490': '142088' '7491': '142089' '7492': '142091' '7493': '142092' '7494': '142093' '7495': '142094' '7496': '142096' '7497': '142097' '7498': '142098' '7499': '142128' '7500': '142129' '7501': '142132' '7502': '142133' '7503': '142358' '7504': '142359' '7505': '142360' '7506': '142361' '7507': '142362' '7508': '142381' '7509': '142402' '7510': '142418' '7511': '142433' '7512': '142511' '7513': '142516' '7514': '142517' '7515': '142519' '7516': '142528' '7517': '142529' '7518': '142530' '7519': '142531' '7520': '142532' '7521': '142533' '7522': '142534' '7523': '142535' '7524': '142536' '7525': '142537' '7526': '142538' '7527': '142539' '7528': '142549' '7529': '142550' '7530': '142551' '7531': '142552' '7532': '142553' '7533': '142563' '7534': '142564' '7535': '142565' '7536': '142566' '7537': '142567' '7538': '142568' '7539': '142569' '7540': '142570' '7541': '142571' '7542': '142572' '7543': '142573' '7544': '142574' '7545': '142575' '7546': '142576' '7547': '142577' '7548': '142579' '7549': '142641' '7550': '142666' '7551': '142668' '7552': '142669' '7553': '142670' '7554': '142671' '7555': '142672' '7556': '142947' '7557': '142948' '7558': '142949' '7559': '142950' '7560': '143039' '7561': '143046' '7562': '143055' '7563': '143056' '7564': '143057' '7565': '143058' '7566': '143059' '7567': '143060' '7568': 
'143061' '7569': '143095' '7570': '143097' '7571': '143098' '7572': '143099' '7573': '143106' '7574': '143186' '7575': '143214' '7576': '143215' '7577': '143216' '7578': '143217' '7579': '143218' '7580': '143219' '7581': '143220' '7582': '143221' '7583': '143237' '7584': '143239' '7585': '143290' '7586': '143295' '7587': '143296' '7588': '143299' '7589': '143300' '7590': '143303' '7591': '143304' '7592': '143305' '7593': '143306' '7594': '143307' '7595': '143308' '7596': '143309' '7597': '143318' '7598': '143319' '7599': '143532' '7600': '143941' '7601': '143989' '7602': '143995' '7603': '144170' '7604': '144171' '7605': '144172' '7606': '144173' '7607': '144179' '7608': '144180' '7609': '144181' '7610': '144182' '7611': '144212' '7612': '144213' '7613': '144214' '7614': '144215' '7615': '144216' '7616': '144423' '7617': '144424' '7618': '144454' '7619': '144465' '7620': '144466' '7621': '144467' '7622': '144468' '7623': '144469' '7624': '144470' '7625': '144471' '7626': '144472' '7627': '144473' '7628': '144474' '7629': '144475' '7630': '144476' '7631': '144477' '7632': '144487' '7633': '144492' '7634': '144542' '7635': '144543' '7636': '144544' '7637': '144545' '7638': '144546' '7639': '144547' '7640': '144548' '7641': '144549' '7642': '144550' '7643': '144551' '7644': '144552' '7645': '144587' '7646': '144592' '7647': '144600' '7648': '144733' '7649': '144740' '7650': '144741' '7651': '144801' '7652': '144809' '7653': '144810' '7654': '144933' '7655': '144934' '7656': '144935' '7657': '144936' '7658': '144937' '7659': '144938' '7660': '144939' '7661': '144940' '7662': '144941' '7663': '144942' '7664': '144943' '7665': '144944' '7666': '144945' '7667': '144946' '7668': '145002' '7669': '145003' '7670': '145004' '7671': '145005' '7672': '145020' '7673': '145027' '7674': '145041' '7675': '145042' '7676': '145043' '7677': '145058' '7678': '145059' '7679': '145067' '7680': '145068' '7681': '145074' '7682': '145183' '7683': '145189' '7684': '145199' '7685': '145241' 
'7686': '145257' '7687': '145258' '7688': '145259' '7689': '145260' '7690': '145431' '7691': '145432' '7692': '145457' '7693': '145458' '7694': '145462' '7695': '145464' '7696': '145475' '7697': '145476' '7698': '145477' '7699': '145549' '7700': '145550' '7701': '145551' '7702': '145552' '7703': '145553' '7704': '145554' '7705': '145555' '7706': '145556' '7707': '145606' '7708': '145607' '7709': '145608' '7710': '145609' '7711': '145610' '7712': '145645' '7713': '145646' '7714': '145653' '7715': '145702' '7716': '145703' '7717': '145704' '7718': '145705' '7719': '145706' '7720': '145707' '7721': '145708' '7722': '145709' '7723': '145710' '7724': '145711' '7725': '145724' '7726': '145727' '7727': '145728' '7728': '145729' '7729': '145730' '7730': '145741' '7731': '145742' '7732': '145743' '7733': '145744' '7734': '145745' '7735': '145746' '7736': '145747' '7737': '145748' '7738': '145749' '7739': '145750' '7740': '145751' '7741': '145752' '7742': '145754' '7743': '145755' '7744': '145756' '7745': '145757' '7746': '145758' '7747': '145759' '7748': '145760' '7749': '145761' '7750': '145762' '7751': '145777' '7752': '145780' '7753': '145783' '7754': '145887' '7755': '145917' '7756': '145918' '7757': '146017' '7758': '146018' '7759': '146019' '7760': '146020' '7761': '146070' '7762': '146147' '7763': '146148' '7764': '146149' '7765': '146150' '7766': '146151' '7767': '146152' '7768': '146153' '7769': '146343' '7770': '146458' '7771': '146478' '7772': '146481' '7773': '146482' '7774': '146483' '7775': '146639' '7776': '146681' '7777': '146683' '7778': '146685' '7779': '146687' '7780': '146689' '7781': '146713' '7782': '146716' '7783': '146724' '7784': '146725' '7785': '146726' '7786': '146727' '7787': '146879' '7788': '146961' '7789': '146968' '7790': '146969' '7791': '146970' '7792': '146988' '7793': '146989' '7794': '147020' '7795': '147021' '7796': '147022' '7797': '147023' '7798': '147024' '7799': '147059' '7800': '147085' '7801': '147086' '7802': '147087' '7803': 
'147126' '7804': '147191' '7805': '147261' '7806': '147265' '7807': '147267' '7808': '147268' '7809': '147269' '7810': '147295' '7811': '147309' '7812': '147409' '7813': '147412' '7814': '147413' '7815': '147780' '7816': '147815' '7817': '147886' '7818': '147956' '7819': '148002' '7820': '148028' '7821': '148031' '7822': '148032' '7823': '148066' '7824': '148070' '7825': '148074' '7826': '148075' '7827': '148076' '7828': '148077' '7829': '148078' '7830': '148079' '7831': '148082' '7832': '148099' '7833': '148112' '7834': '148113' '7835': '148114' '7836': '148120' '7837': '148121' '7838': '148124' '7839': '148130' '7840': '148131' '7841': '148132' '7842': '148133' '7843': '148168' '7844': '148186' '7845': '148187' '7846': '148190' '7847': '148208' '7848': '148210' '7849': '148211' '7850': '148212' '7851': '148213' '7852': '148214' '7853': '148215' '7854': '148216' '7855': '148217' '7856': '148218' '7857': '148231' '7858': '148233' '7859': '148234' '7860': '148235' '7861': '148246' '7862': '148285' '7863': '148286' '7864': '148287' '7865': '148288' '7866': '148289' '7867': '148290' '7868': '148302' '7869': '148303' '7870': '148305' '7871': '148429' '7872': '148430' '7873': '148439' '7874': '148441' '7875': '148443' '7876': '148444' '7877': '148510' '7878': '148513' '7879': '148514' '7880': '148516' '7881': '148517' '7882': '148518' '7883': '148519' '7884': '148532' '7885': '148535' '7886': '148536' '7887': '148537' '7888': '148584' '7889': '148585' '7890': '148586' '7891': '148587' '7892': '148602' '7893': '148603' '7894': '148604' '7895': '148605' '7896': '148606' '7897': '148607' '7898': '148608' '7899': '148609' '7900': '148610' '7901': '148611' '7902': '148612' '7903': '148613' '7904': '148773' '7905': '149075' '7906': '149078' '7907': '149082' '7908': '149083' '7909': '149099' '7910': '149100' '7911': '149101' '7912': '149102' '7913': '149103' '7914': '149118' '7915': '149124' '7916': '149138' '7917': '149139' '7918': '149140' '7919': '149141' '7920': '149142' 
'7921': '149143' '7922': '149185' '7923': '149369' '7924': '149370' '7925': '149416' '7926': '149417' '7927': '149422' '7928': '149452' '7929': '149488' '7930': '149523' '7931': '149623' '7932': '149625' '7933': '149626' '7934': '149687' '7935': '149689' '7936': '149690' '7937': '149700' '7938': '149701' '7939': '149712' '7940': '149714' '7941': '149727' '7942': '149750' '7943': '149775' '7944': '149776' '7945': '149777' '7946': '149778' '7947': '149842' '7948': '149951' '7949': '149953' '7950': '150015' '7951': '150017' '7952': '150018' '7953': '150062' '7954': '150063' '7955': '150064' '7956': '150073' '7957': '150078' '7958': '150079' '7959': '150080' '7960': '150265' '7961': '150266' '7962': '150267' '7963': '150268' '7964': '150287' '7965': '150288' '7966': '151404' '7967': '152103' '7968': '152253' '7969': '152254' '7970': '152258' '7971': '152261' '7972': '152262' '7973': '152324' '7974': '152418' '7975': '152425' '7976': '152480' '7977': '152543' '7978': '152545' '7979': '152568' '7980': '152569' '7981': '152570' '7982': '153337' '7983': '153383' '7984': '153452' '7985': '153946' '7986': '153955' '7987': '153956' '7988': '154303' '7989': '154305' '7990': '154306' '7991': '154307' '7992': '154308' '7993': '154309' '7994': '154413' '7995': '154414' '7996': '155066' splits: - name: train num_bytes: 440849403.831 num_examples: 7997 download_size: 432687499 dataset_size: 440849403.831 configs: - config_name: default data_files: - split: train path: data/train-* ---
colettemb/288-demo
--- license: pddl ---
BangumiBase/fullmetalalchemist
--- license: mit tags: - art size_categories: - 1K<n<10K --- # Bangumi Image Base of Fullmetal Alchemist This is the image base of the bangumi Fullmetal Alchemist; we detected 44 characters and 5107 images in total. The full dataset is [here](all.zip). **Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability). Here is a preview of the characters: | # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 | |:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------| | 0 | 1190 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) | | 1 | 164 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) | | 2 | 20 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) | | 3 | 61 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview 
2](3/preview_2.png) | ![preview 3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) | | 4 | 56 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) | | 5 | 80 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) | | 6 | 384 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) | | 7 | 427 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) | | 8 | 179 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | ![preview 7](8/preview_7.png) | ![preview 8](8/preview_8.png) | | 9 | 73 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) | | 10 | 93 | 
[Download](10/dataset.zip) | ![preview 1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) | | 11 | 33 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) | | 12 | 84 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) | | 13 | 113 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) | | 14 | 95 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) | | 15 | 129 | [Download](15/dataset.zip) | ![preview 1](15/preview_1.png) | ![preview 2](15/preview_2.png) | ![preview 3](15/preview_3.png) | ![preview 4](15/preview_4.png) | ![preview 5](15/preview_5.png) | ![preview 6](15/preview_6.png) | ![preview 7](15/preview_7.png) | ![preview 8](15/preview_8.png) | | 16 | 318 | [Download](16/dataset.zip) | ![preview 1](16/preview_1.png) | ![preview 2](16/preview_2.png) | ![preview 3](16/preview_3.png) | ![preview 4](16/preview_4.png) | 
![preview 5](16/preview_5.png) | ![preview 6](16/preview_6.png) | ![preview 7](16/preview_7.png) | ![preview 8](16/preview_8.png) | | 17 | 187 | [Download](17/dataset.zip) | ![preview 1](17/preview_1.png) | ![preview 2](17/preview_2.png) | ![preview 3](17/preview_3.png) | ![preview 4](17/preview_4.png) | ![preview 5](17/preview_5.png) | ![preview 6](17/preview_6.png) | ![preview 7](17/preview_7.png) | ![preview 8](17/preview_8.png) | | 18 | 26 | [Download](18/dataset.zip) | ![preview 1](18/preview_1.png) | ![preview 2](18/preview_2.png) | ![preview 3](18/preview_3.png) | ![preview 4](18/preview_4.png) | ![preview 5](18/preview_5.png) | ![preview 6](18/preview_6.png) | ![preview 7](18/preview_7.png) | ![preview 8](18/preview_8.png) | | 19 | 48 | [Download](19/dataset.zip) | ![preview 1](19/preview_1.png) | ![preview 2](19/preview_2.png) | ![preview 3](19/preview_3.png) | ![preview 4](19/preview_4.png) | ![preview 5](19/preview_5.png) | ![preview 6](19/preview_6.png) | ![preview 7](19/preview_7.png) | ![preview 8](19/preview_8.png) | | 20 | 78 | [Download](20/dataset.zip) | ![preview 1](20/preview_1.png) | ![preview 2](20/preview_2.png) | ![preview 3](20/preview_3.png) | ![preview 4](20/preview_4.png) | ![preview 5](20/preview_5.png) | ![preview 6](20/preview_6.png) | ![preview 7](20/preview_7.png) | ![preview 8](20/preview_8.png) | | 21 | 54 | [Download](21/dataset.zip) | ![preview 1](21/preview_1.png) | ![preview 2](21/preview_2.png) | ![preview 3](21/preview_3.png) | ![preview 4](21/preview_4.png) | ![preview 5](21/preview_5.png) | ![preview 6](21/preview_6.png) | ![preview 7](21/preview_7.png) | ![preview 8](21/preview_8.png) | | 22 | 53 | [Download](22/dataset.zip) | ![preview 1](22/preview_1.png) | ![preview 2](22/preview_2.png) | ![preview 3](22/preview_3.png) | ![preview 4](22/preview_4.png) | ![preview 5](22/preview_5.png) | ![preview 6](22/preview_6.png) | ![preview 7](22/preview_7.png) | ![preview 8](22/preview_8.png) | | 23 | 97 | 
[Download](23/dataset.zip) | ![preview 1](23/preview_1.png) | ![preview 2](23/preview_2.png) | ![preview 3](23/preview_3.png) | ![preview 4](23/preview_4.png) | ![preview 5](23/preview_5.png) | ![preview 6](23/preview_6.png) | ![preview 7](23/preview_7.png) | ![preview 8](23/preview_8.png) | | 24 | 142 | [Download](24/dataset.zip) | ![preview 1](24/preview_1.png) | ![preview 2](24/preview_2.png) | ![preview 3](24/preview_3.png) | ![preview 4](24/preview_4.png) | ![preview 5](24/preview_5.png) | ![preview 6](24/preview_6.png) | ![preview 7](24/preview_7.png) | ![preview 8](24/preview_8.png) | | 25 | 217 | [Download](25/dataset.zip) | ![preview 1](25/preview_1.png) | ![preview 2](25/preview_2.png) | ![preview 3](25/preview_3.png) | ![preview 4](25/preview_4.png) | ![preview 5](25/preview_5.png) | ![preview 6](25/preview_6.png) | ![preview 7](25/preview_7.png) | ![preview 8](25/preview_8.png) | | 26 | 12 | [Download](26/dataset.zip) | ![preview 1](26/preview_1.png) | ![preview 2](26/preview_2.png) | ![preview 3](26/preview_3.png) | ![preview 4](26/preview_4.png) | ![preview 5](26/preview_5.png) | ![preview 6](26/preview_6.png) | ![preview 7](26/preview_7.png) | ![preview 8](26/preview_8.png) | | 27 | 246 | [Download](27/dataset.zip) | ![preview 1](27/preview_1.png) | ![preview 2](27/preview_2.png) | ![preview 3](27/preview_3.png) | ![preview 4](27/preview_4.png) | ![preview 5](27/preview_5.png) | ![preview 6](27/preview_6.png) | ![preview 7](27/preview_7.png) | ![preview 8](27/preview_8.png) | | 28 | 58 | [Download](28/dataset.zip) | ![preview 1](28/preview_1.png) | ![preview 2](28/preview_2.png) | ![preview 3](28/preview_3.png) | ![preview 4](28/preview_4.png) | ![preview 5](28/preview_5.png) | ![preview 6](28/preview_6.png) | ![preview 7](28/preview_7.png) | ![preview 8](28/preview_8.png) | | 29 | 32 | [Download](29/dataset.zip) | ![preview 1](29/preview_1.png) | ![preview 2](29/preview_2.png) | ![preview 3](29/preview_3.png) | ![preview 4](29/preview_4.png) | 
![preview 5](29/preview_5.png) | ![preview 6](29/preview_6.png) | ![preview 7](29/preview_7.png) | ![preview 8](29/preview_8.png) | | 30 | 19 | [Download](30/dataset.zip) | ![preview 1](30/preview_1.png) | ![preview 2](30/preview_2.png) | ![preview 3](30/preview_3.png) | ![preview 4](30/preview_4.png) | ![preview 5](30/preview_5.png) | ![preview 6](30/preview_6.png) | ![preview 7](30/preview_7.png) | ![preview 8](30/preview_8.png) | | 31 | 15 | [Download](31/dataset.zip) | ![preview 1](31/preview_1.png) | ![preview 2](31/preview_2.png) | ![preview 3](31/preview_3.png) | ![preview 4](31/preview_4.png) | ![preview 5](31/preview_5.png) | ![preview 6](31/preview_6.png) | ![preview 7](31/preview_7.png) | ![preview 8](31/preview_8.png) | | 32 | 39 | [Download](32/dataset.zip) | ![preview 1](32/preview_1.png) | ![preview 2](32/preview_2.png) | ![preview 3](32/preview_3.png) | ![preview 4](32/preview_4.png) | ![preview 5](32/preview_5.png) | ![preview 6](32/preview_6.png) | ![preview 7](32/preview_7.png) | ![preview 8](32/preview_8.png) | | 33 | 23 | [Download](33/dataset.zip) | ![preview 1](33/preview_1.png) | ![preview 2](33/preview_2.png) | ![preview 3](33/preview_3.png) | ![preview 4](33/preview_4.png) | ![preview 5](33/preview_5.png) | ![preview 6](33/preview_6.png) | ![preview 7](33/preview_7.png) | ![preview 8](33/preview_8.png) | | 34 | 9 | [Download](34/dataset.zip) | ![preview 1](34/preview_1.png) | ![preview 2](34/preview_2.png) | ![preview 3](34/preview_3.png) | ![preview 4](34/preview_4.png) | ![preview 5](34/preview_5.png) | ![preview 6](34/preview_6.png) | ![preview 7](34/preview_7.png) | ![preview 8](34/preview_8.png) | | 35 | 14 | [Download](35/dataset.zip) | ![preview 1](35/preview_1.png) | ![preview 2](35/preview_2.png) | ![preview 3](35/preview_3.png) | ![preview 4](35/preview_4.png) | ![preview 5](35/preview_5.png) | ![preview 6](35/preview_6.png) | ![preview 7](35/preview_7.png) | ![preview 8](35/preview_8.png) | | 36 | 23 | [Download](36/dataset.zip) 
| ![preview 1](36/preview_1.png) | ![preview 2](36/preview_2.png) | ![preview 3](36/preview_3.png) | ![preview 4](36/preview_4.png) | ![preview 5](36/preview_5.png) | ![preview 6](36/preview_6.png) | ![preview 7](36/preview_7.png) | ![preview 8](36/preview_8.png) | | 37 | 11 | [Download](37/dataset.zip) | ![preview 1](37/preview_1.png) | ![preview 2](37/preview_2.png) | ![preview 3](37/preview_3.png) | ![preview 4](37/preview_4.png) | ![preview 5](37/preview_5.png) | ![preview 6](37/preview_6.png) | ![preview 7](37/preview_7.png) | ![preview 8](37/preview_8.png) | | 38 | 12 | [Download](38/dataset.zip) | ![preview 1](38/preview_1.png) | ![preview 2](38/preview_2.png) | ![preview 3](38/preview_3.png) | ![preview 4](38/preview_4.png) | ![preview 5](38/preview_5.png) | ![preview 6](38/preview_6.png) | ![preview 7](38/preview_7.png) | ![preview 8](38/preview_8.png) | | 39 | 13 | [Download](39/dataset.zip) | ![preview 1](39/preview_1.png) | ![preview 2](39/preview_2.png) | ![preview 3](39/preview_3.png) | ![preview 4](39/preview_4.png) | ![preview 5](39/preview_5.png) | ![preview 6](39/preview_6.png) | ![preview 7](39/preview_7.png) | ![preview 8](39/preview_8.png) | | 40 | 18 | [Download](40/dataset.zip) | ![preview 1](40/preview_1.png) | ![preview 2](40/preview_2.png) | ![preview 3](40/preview_3.png) | ![preview 4](40/preview_4.png) | ![preview 5](40/preview_5.png) | ![preview 6](40/preview_6.png) | ![preview 7](40/preview_7.png) | ![preview 8](40/preview_8.png) | | 41 | 40 | [Download](41/dataset.zip) | ![preview 1](41/preview_1.png) | ![preview 2](41/preview_2.png) | ![preview 3](41/preview_3.png) | ![preview 4](41/preview_4.png) | ![preview 5](41/preview_5.png) | ![preview 6](41/preview_6.png) | ![preview 7](41/preview_7.png) | ![preview 8](41/preview_8.png) | | 42 | 14 | [Download](42/dataset.zip) | ![preview 1](42/preview_1.png) | ![preview 2](42/preview_2.png) | ![preview 3](42/preview_3.png) | ![preview 4](42/preview_4.png) | ![preview 5](42/preview_5.png) | 
![preview 6](42/preview_6.png) | ![preview 7](42/preview_7.png) | ![preview 8](42/preview_8.png) | | noise | 108 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
AndyLiu0104/Soldering-Data-Tiny-Complete-Sentence
--- dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 18013335.875 num_examples: 10481 download_size: 11576663 dataset_size: 18013335.875 --- # Dataset Card for "Soldering-Data-Tiny-Complete-Sentence" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Amanaccessassist/distileplay
--- dataset_info: features: - name: input dtype: string - name: label dtype: class_label: names: '0': Negative '1': Positive '2': Neutral splits: - name: train num_bytes: 32764330.300624426 num_examples: 368145 - name: test num_bytes: 14041906.699375574 num_examples: 157777 download_size: 26351556 dataset_size: 46806237.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
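The `class_label` feature above stores each `label` as an integer id. A minimal sketch of decoding those ids back to class names (the mapping is restated from the metadata above; the helper name is ours):

```python
# Sentiment label names restated from the class_label mapping in the dataset metadata.
ID2LABEL = {0: "Negative", 1: "Positive", 2: "Neutral"}

def decode_label(label_id: int) -> str:
    """Map a stored integer label back to its class name."""
    return ID2LABEL[label_id]

print(decode_label(1))  # Positive
```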
huuuyeah/MeetingBank_Audio
--- license: cc-by-nc-sa-4.0 --- ## Overview MeetingBank is a benchmark dataset created from the city councils of 6 major U.S. cities to supplement existing datasets. It contains 1,366 meetings with over 3,579 hours of video, as well as transcripts, PDF documents of meeting minutes, agendas, and other metadata. On average, a council meeting is 2.6 hours long and its transcript contains over 28k tokens, making it a valuable testbed for meeting summarizers and for extracting structure from meeting videos. The dataset contains 6,892 segment-level summarization instances for training and evaluating performance. ## Acknowledgement Please cite the following paper in work that makes use of this dataset: [MeetingBank: A Benchmark Dataset for Meeting Summarization](https://arxiv.org/abs/2305.17529)\ Yebowen Hu, Tim Ganter, Hanieh Deilamsalehy, Franck Dernoncourt, Hassan Foroosh, Fei Liu\ In the main conference of the Association for Computational Linguistics (ACL'23), Toronto, Canada. ## Bibtex ``` @inproceedings{hu-etal-2023-meetingbank, title = "MeetingBank: A Benchmark Dataset for Meeting Summarization", author = "Yebowen Hu and Tim Ganter and Hanieh Deilamsalehy and Franck Dernoncourt and Hassan Foroosh and Fei Liu", booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL)", month = July, year = "2023", address = "Toronto, Canada", publisher = "Association for Computational Linguistics", } ``` ## Resources The MeetingBank dataset is hosted on Zenodo, and the audio files of each meeting are hosted individually on Hugging Face. The resources include meeting audio, transcripts, the main MeetingBank JSON file, summaries from 6 systems, and human annotations. 
**Summary, Segments Transcripts and VideoList**: [zenodo](https://zenodo.org/record/7989108) **Meeting Audios**: [HuggingFace](https://huggingface.co/datasets/huuuyeah/MeetingBank) Some scripts can be found in github repo [MeetingBank_Utils](https://github.com/YebowenHu/MeetingBank-utils)
arize-ai/fashion_mnist_quality_drift
--- annotations_creators: - expert-generated language_creators: - expert-generated language: - en license: - mit multilinguality: - monolingual size_categories: - 10K<n<100K source_datasets: - extended|imdb task_categories: - image-classification task_ids: - multi-class-classification pretty_name: sentiment-classification-reviews-with-drift --- # Dataset Card for `reviews_with_drift` ## Table of Contents - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description ### Dataset Summary This dataset was crafted to be used in our tutorial [Link to the tutorial when ready]. It consists of a large Movie Review Dataset mixed with some reviews from a Hotel Review Dataset. The training/validation sets are obtained purely from the Movie Review Dataset, while the production set is mixed. Some other features have been added (`age`, `gender`, `context`), as well as a made-up timestamp `prediction_ts` of when the inference took place. 
### Supported Tasks and Leaderboards `text-classification`, `sentiment-classification`: The dataset is mainly used for text classification: given the text, predict the sentiment (positive or negative). ### Languages The text is mainly written in English. ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data [More Information Needed] #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations [More Information Needed] #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions Thanks to [@fjcasti1](https://github.com/fjcasti1) for adding this dataset.
CVasNLPExperiments/VQAv2_testdev_no_image_google_flan_t5_xl_mode_T_A_D_PNP_FILTER_C_Q_rices_ns_1000
--- dataset_info: features: - name: id dtype: int64 - name: question dtype: string - name: true_label sequence: string - name: prediction dtype: string splits: - name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random_ num_bytes: 94166 num_examples: 1000 download_size: 30018 dataset_size: 94166 --- # Dataset Card for "VQAv2_testdev_no_image_google_flan_t5_xl_mode_T_A_D_PNP_FILTER_C_Q_rices_ns_1000" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
frodoclicks/train
--- license: mit ---
WforGodot/add345_1
--- license: openrail ---
AdapterOcean/code_instructions_standardized_cluster_9_std
--- dataset_info: features: - name: message dtype: string - name: message_type dtype: string - name: message_id dtype: int64 - name: conversation_id dtype: int64 - name: cluster dtype: float64 - name: __index_level_0__ dtype: int64 splits: - name: train num_bytes: 35793557 num_examples: 37842 download_size: 16026961 dataset_size: 35793557 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "code_instructions_standardized_cluster_9_std" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
passing2961/dialogcc
--- license: cc-by-nc-sa-4.0 language: - en pretty_name: DialogCC size_categories: - 1K<n<10K multilinguality: - monolingual annotation_creators: - machine-generated tags: - multi-modal dialogue source_datasets: - BlendedSkillTalk - DailyDialog - Persona-Chat - Wizard-of-Wikipedia - EmpatheticDialogues - CC3M task_ids: - conversational task_categories: - text-to-image - image-to-text splits: - name: train num_examples: 68402 - name: valid num_examples: 7644 - name: test num_examples: 7324 dataset_size: 83,370 --- # Dataset Card for DialogCC ## Dataset Description - **Repository:** [Code](https://github.com/passing2961/dialogcc) - **Paper:** [DialogCC: An Automated Pipeline for Creating High-Quality Multi-Modal Dialogue Dataset](https://arxiv.org/abs/2212.04119) - **Point of Contact:** [Young-Jun Lee](mailto:yj2961@kaist.ac.kr) ## Dataset Summary DialogCC is a publicly available, high-quality, and diverse multi-modal dialogue dataset that contains multiple images per dialogue and per utterance. ## Languages English ## Dataset Structure field | type | description --- | --- | --- `dialogue_id` | str | the identifier for the dialogue, containing the original text-only dialogue type (e.g., bst) and index `dialogue` | list of dict | the dialogue, where each dict entry includes {utterance_idx, utterance, speaker, rationale, shared_image, description} `split` | str | the split information: {train, valid, test} For the original text-only dialogue dataset, we have five types: "bst" (BlendedSkillTalk), "empathy" (EmpatheticDialogues), "daily" (DailyDialog), "wow" (Wizard-of-Wikipedia), and "persona" (Persona-Chat). In the "dialogue" field, the "shared_image" field is a list of dict. Each dict entry comprises two key pieces of information: "image_url" and "caption", both of which are sourced from the CC3M dataset. 
**Note:** We prompt GPT-4 to generate appropriate image-sharing moments within dialogues, including the utterance, the speaker, the rationale behind sharing, and a description of the image. Due to the nature of the generation process, GPT-4 may produce different descriptions, speakers, or rationales at the same image-sharing turn. Consequently, the same dialogue_id can appear across different instances within the dataset, representing these variations. ## Dataset Creation To create DialogCC, we propose a fully automatic framework for creating a multi-modal dialogue dataset that involves three main steps: (1) collect source dialogue datasets (i.e., EmpatheticDialogues, Persona-Chat, DailyDialog, Wizard-of-Wikipedia, Blended Skill Talk) and a source image-caption pair dataset (i.e., CC3M), (2) align the most appropriate images to the dialogue by leveraging GPT-4 and CLIP, and (3) filter out inappropriate images based on CLIP similarity for image-image consistency. For more details, please refer to our [paper](https://arxiv.org/abs/2212.04119). ### Further Details, Social Impacts, and Limitations Please refer to our [paper](https://arxiv.org/abs/2212.04119). ## Additional Information For a brief summary of our paper, please see this [project page](https://dialogcc.github.io). ## Limitations Please refer to the Limitations section in our [paper](https://arxiv.org/abs/2212.04119). ### Recommendations Despite our efforts to create a high-quality and diverse multi-modal dialogue dataset, it still contains harmful content, such as social bias. Moreover, since DialogCC incorporates dialogues from the DailyDialog dataset, which is licensed under CC BY-NC-SA 4.0, DialogCC is shared under the license CC-BY-NC-SA 4.0. Therefore, we strongly recommend using our dataset for academic and research purposes. ### Citation Please cite our work if you find the resources in this repository useful: ``` TBD ```
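To make the schema above concrete, here is a minimal sketch that walks the `dialogue` field of one record and collects the CC3M image URLs from each `shared_image` list. The record below is fabricated purely for illustration; real records come from the dataset itself:

```python
# Hypothetical record following the field table above (values are made up).
record = {
    "dialogue_id": "bst:0",
    "split": "train",
    "dialogue": [
        {
            "utterance_idx": 0,
            "utterance": "Look at this place I visited!",
            "speaker": "A",
            "rationale": "sharing a photo of the place",
            "shared_image": [
                {"image_url": "https://example.com/img.jpg", "caption": "a park"}
            ],
            "description": "a photo of a park",
        },
        {
            "utterance_idx": 1,
            "utterance": "Nice!",
            "speaker": "B",
            "rationale": None,
            "shared_image": [],  # turns without image sharing have an empty list
            "description": None,
        },
    ],
}

def shared_image_urls(record: dict) -> list:
    """Collect every CC3M image URL shared across the dialogue turns."""
    return [
        img["image_url"]
        for turn in record["dialogue"]
        for img in turn["shared_image"]
    ]

print(shared_image_urls(record))  # ['https://example.com/img.jpg']
```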
Mediform/sharegpt-french
--- dataset_info: features: - name: conversations list: - name: from dtype: string - name: value dtype: string splits: - name: train num_bytes: 32698024 num_examples: 5616 download_size: 14734838 dataset_size: 32698024 configs: - config_name: default data_files: - split: train path: data/train-* ---
zolak/twitter_dataset_50_1713146806
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 410994 num_examples: 1010 download_size: 220233 dataset_size: 410994 configs: - config_name: default data_files: - split: train path: data/train-* ---
zolak/twitter_dataset_78_1713211298
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 1521573 num_examples: 3760 download_size: 759876 dataset_size: 1521573 configs: - config_name: default data_files: - split: train path: data/train-* ---
mmathys/openai-moderation-api-evaluation
--- license: mit task_categories: - text-classification language: en pretty_name: OpenAI Moderation API Validation Dataset --- # Evaluation dataset for the paper "A Holistic Approach to Undesired Content Detection" The evaluation dataset `data/samples-1680.jsonl.gz` is the test set used in [this paper](https://arxiv.org/abs/2208.03274). Each line contains information about one sample as a JSON object, and each sample is labeled according to our taxonomy. Each category label is a binary flag; if a label is not included in the JSON object, it means we do not know the label. | Category | Label | Definition | | -------- | ----- | ---------- | | sexual | `S` | Content meant to arouse sexual excitement, such as the description of sexual activity, or that promotes sexual services (excluding sex education and wellness). | | hate | `H` | Content that expresses, incites, or promotes hate based on race, gender, ethnicity, religion, nationality, sexual orientation, disability status, or caste. | | violence | `V` | Content that promotes or glorifies violence or celebrates the suffering or humiliation of others. | | harassment | `HR` | Content that may be used to torment or annoy individuals in real life, or make harassment more likely to occur. | | self-harm | `SH` | Content that promotes, encourages, or depicts acts of self-harm, such as suicide, cutting, and eating disorders. | | sexual/minors | `S3` | Sexual content that includes an individual who is under 18 years old. | | hate/threatening | `H2` | Hateful content that also includes violence or serious harm towards the targeted group. | | violence/graphic | `V2` | Violent content that depicts death, violence, or serious physical injury in extreme graphic detail. | Parsed from the GitHub repo: https://github.com/openai/moderation-api-release
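As a sanity check of the JSONL format described above, here is a minimal sketch that parses one record and maps each category to a flag, treating absent keys as unknown. The sample line below is made up; real lines come from `data/samples-1680.jsonl.gz`:

```python
import json

# Category labels from the taxonomy table above.
CATEGORIES = ["S", "H", "V", "HR", "SH", "S3", "H2", "V2"]

def read_labels(line: str) -> dict:
    """Map each category to True/False, or None when the key is absent (label unknown)."""
    record = json.loads(line)
    return {cat: bool(record[cat]) if cat in record else None for cat in CATEGORIES}

# Hypothetical sample line for illustration only.
sample = '{"prompt": "some text", "S": 0, "H": 1, "V2": 0}'
labels = read_labels(sample)
print(labels["H"])   # True (flagged)
print(labels["S3"])  # None (label unknown)
```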
MatsuoDochiai/Golinz
--- license: openrail ---
AdapterOcean/math_dataset_standardized_cluster_3
--- dataset_info: features: - name: text dtype: string - name: conversation_id dtype: int64 - name: embedding sequence: float64 - name: cluster dtype: int64 splits: - name: train num_bytes: 180711025 num_examples: 18655 download_size: 50145228 dataset_size: 180711025 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "math_dataset_standardized_cluster_3" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
chathuranga-jayanath/context-5-predict-token-for-fine-tune
--- dataset_info: features: - name: id dtype: int64 - name: filepath dtype: string - name: start_bug_line dtype: int64 - name: end_bug_line dtype: int64 - name: bug dtype: string - name: fix dtype: string - name: ctx dtype: string splits: - name: train num_bytes: 179586 num_examples: 305 - name: validation num_bytes: 21916 num_examples: 37 - name: test num_bytes: 22956 num_examples: 37 download_size: 60857 dataset_size: 224458 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* ---
RUCAIBox/Question-Generation
--- language: - en multilinguality: - monolingual task_categories: - text2text-generation task_ids: [] tags: - question-generation --- These are the question generation datasets collected by TextBox, including: - SQuAD (squadqg) - CoQA (coqaqg) - NewsQA (newsqa) - HotpotQA (hotpotqa) - MS MARCO (marco) - MSQG (msqg) - NarrativeQA (nqa) - QuAC (quac). The details and leaderboard of each dataset can be found on the [TextBox page](https://github.com/RUCAIBox/TextBox#dataset).
kpriyanshu256/MultiTabQA-multitable_pretraining-train-v2-46000
--- dataset_info: features: - name: tables sequence: string - name: table_names sequence: string - name: query dtype: string - name: answer dtype: string - name: source dtype: string - name: target dtype: string - name: source_latex dtype: string - name: target_latex dtype: string - name: source_html dtype: string - name: target_html dtype: string - name: source_markdown dtype: string - name: target_markdown dtype: string splits: - name: train num_bytes: 5137284719 num_examples: 1000 download_size: 1094992564 dataset_size: 5137284719 configs: - config_name: default data_files: - split: train path: data/train-* ---
open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v5.6
--- pretty_name: Evaluation run of bardsai/jaskier-7b-dpo-v5.6 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [bardsai/jaskier-7b-dpo-v5.6](https://huggingface.co/bardsai/jaskier-7b-dpo-v5.6)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v5.6\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-02-17T15:44:22.548008](https://huggingface.co/datasets/open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v5.6/blob/main/results_2024-02-17T15-44-22.548008.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6501937924638518,\n\ \ \"acc_stderr\": 0.032025249091365754,\n \"acc_norm\": 0.6494469200952948,\n\ \ \"acc_norm_stderr\": 0.03269529478578274,\n \"mc1\": 0.6303549571603427,\n\ \ \"mc1_stderr\": 0.01689818070697388,\n \"mc2\": 0.7781424860062839,\n\ \ \"mc2_stderr\": 0.013751565023330138\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.7090443686006825,\n \"acc_stderr\": 0.01327307786590759,\n\ \ \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.012968040686869147\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7137024497112129,\n\ \ \"acc_stderr\": 0.004511063351278702,\n \"acc_norm\": 0.8899621589324835,\n\ \ \"acc_norm_stderr\": 0.00312297363203947\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\ \ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\ \ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\ \ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\ \ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \ \ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\ \ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\ \ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\ \ \"acc_norm_stderr\": 0.03551446610810826\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \ \ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\ : 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\ \ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\ \ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\ \ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\ \ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\ \ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\ \ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\ \ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\ \ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"\ acc_norm\": 0.41534391534391535,\n 
\"acc_norm_stderr\": 0.025379524910778394\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\ \ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\ \ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\ \ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\ \ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\ \ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\ : 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\ \ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\ : 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\ \ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \ \ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\ \ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \ \ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066485,\n \ \ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066485\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \ \ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\ acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\ acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\ acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931055,\n \"\ acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931055\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \ \ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\ \ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\ \ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\ \ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\ acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\ \ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\ \ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\ \ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\ \ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\ \ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\ \ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\ \ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\ \ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\ \ \"acc_stderr\": 0.013625556907993469,\n \"acc_norm\": 0.8237547892720306,\n\ \ \"acc_norm_stderr\": 0.013625556907993469\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\ \ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n\ \ \"acc_stderr\": 0.016602564615049942,\n 
\"acc_norm\": 0.4402234636871508,\n\ \ \"acc_norm_stderr\": 0.016602564615049942\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\ \ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\ \ \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n\ \ \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n\ \ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \ \ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4758800521512386,\n\ \ \"acc_stderr\": 0.012755368722863937,\n \"acc_norm\": 0.4758800521512386,\n\ \ \"acc_norm_stderr\": 0.012755368722863937\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \ \ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \ \ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\ \ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\ \ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\ \ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\ \ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\ \ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \ \ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\ \ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\ \ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\ \ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6303549571603427,\n\ \ \"mc1_stderr\": 0.01689818070697388,\n \"mc2\": 0.7781424860062839,\n\ \ \"mc2_stderr\": 0.013751565023330138\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8453038674033149,\n \"acc_stderr\": 0.010163172650433535\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6967399545109931,\n \ \ \"acc_stderr\": 0.012661502663418697\n }\n}\n```" repo_url: https://huggingface.co/bardsai/jaskier-7b-dpo-v5.6 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|arc:challenge|25_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-02-17T15-44-22.548008.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|gsm8k|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_02_17T15_44_22.548008 path: - 
'**/details_harness|hellaswag|10_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T15-44-22.548008.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-17T15-44-22.548008.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T15-44-22.548008.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T15-44-22.548008.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T15-44-22.548008.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-17T15-44-22.548008.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T15-44-22.548008.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-management|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-marketing|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-virology|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T15-44-22.548008.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|truthfulqa:mc|0_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-02-17T15-44-22.548008.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_02_17T15_44_22.548008 path: - '**/details_harness|winogrande|5_2024-02-17T15-44-22.548008.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-02-17T15-44-22.548008.parquet' - config_name: results data_files: - split: 
2024_02_17T15_44_22.548008 path: - results_2024-02-17T15-44-22.548008.parquet - split: latest path: - results_2024-02-17T15-44-22.548008.parquet --- # Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v5.6 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [bardsai/jaskier-7b-dpo-v5.6](https://huggingface.co/bardsai/jaskier-7b-dpo-v5.6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v5.6", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-17T15:44:22.548008](https://huggingface.co/datasets/open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v5.6/blob/main/results_2024-02-17T15-44-22.548008.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6501937924638518, "acc_stderr": 0.032025249091365754, "acc_norm": 0.6494469200952948, "acc_norm_stderr": 0.03269529478578274, "mc1": 0.6303549571603427, "mc1_stderr": 0.01689818070697388, "mc2": 0.7781424860062839, "mc2_stderr": 0.013751565023330138 }, "harness|arc:challenge|25": { "acc": 0.7090443686006825, "acc_stderr": 0.01327307786590759, "acc_norm": 0.7303754266211604, "acc_norm_stderr": 0.012968040686869147 }, "harness|hellaswag|10": { "acc": 0.7137024497112129, "acc_stderr": 0.004511063351278702, "acc_norm": 0.8899621589324835, "acc_norm_stderr": 0.00312297363203947 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6943396226415094, "acc_stderr": 0.028353298073322663, "acc_norm": 0.6943396226415094, "acc_norm_stderr": 0.028353298073322663 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 
0.04975698519562428 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.28, "acc_stderr": 0.04512608598542126, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542126 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.048786087144669955, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.048786087144669955 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.046920083813689104, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.046920083813689104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41534391534391535, "acc_stderr": 0.025379524910778394, "acc_norm": 0.41534391534391535, "acc_norm_stderr": 0.025379524910778394 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.48412698412698413, "acc_stderr": 0.04469881854072606, "acc_norm": 0.48412698412698413, "acc_norm_stderr": 0.04469881854072606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.28, "acc_stderr": 0.04512608598542126, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542126 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7806451612903226, "acc_stderr": 0.023540799358723295, "acc_norm": 0.7806451612903226, "acc_norm_stderr": 0.023540799358723295 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009182, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009182 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.02860620428922987, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.02860620428922987 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9119170984455959, "acc_stderr": 0.02045374660160103, "acc_norm": 0.9119170984455959, "acc_norm_stderr": 0.02045374660160103 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.658974358974359, "acc_stderr": 0.02403548967633508, "acc_norm": 0.658974358974359, "acc_norm_stderr": 0.02403548967633508 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3148148148148148, "acc_stderr": 0.028317533496066485, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.028317533496066485 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6722689075630253, "acc_stderr": 0.03048991141767323, "acc_norm": 0.6722689075630253, "acc_norm_stderr": 0.03048991141767323 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, "acc_stderr": 0.03943966699183629, "acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.01563002297009244, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.01563002297009244 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5185185185185185, "acc_stderr": 0.03407632093854051, 
"acc_norm": 0.5185185185185185, "acc_norm_stderr": 0.03407632093854051 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8333333333333334, "acc_stderr": 0.026156867523931055, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.026156867523931055 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290916, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290916 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.034981493854624714, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.034981493854624714 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7603305785123967, "acc_stderr": 0.03896878985070416, "acc_norm": 0.7603305785123967, "acc_norm_stderr": 0.03896878985070416 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7914110429447853, "acc_stderr": 0.031921934489347235, "acc_norm": 0.7914110429447853, "acc_norm_stderr": 0.031921934489347235 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406964, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406964 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, 
"acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8237547892720306, "acc_stderr": 0.013625556907993469, "acc_norm": 0.8237547892720306, "acc_norm_stderr": 0.013625556907993469 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7312138728323699, "acc_stderr": 0.023868003262500104, "acc_norm": 0.7312138728323699, "acc_norm_stderr": 0.023868003262500104 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4402234636871508, "acc_stderr": 0.016602564615049942, "acc_norm": 0.4402234636871508, "acc_norm_stderr": 0.016602564615049942 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7189542483660131, "acc_stderr": 0.025738854797818733, "acc_norm": 0.7189542483660131, "acc_norm_stderr": 0.025738854797818733 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.02575586592263295, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.02575586592263295 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7314814814814815, "acc_stderr": 0.024659685185967284, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.024659685185967284 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4758800521512386, "acc_stderr": 0.012755368722863937, "acc_norm": 0.4758800521512386, "acc_norm_stderr": 0.012755368722863937 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6875, "acc_stderr": 0.02815637344037142, "acc_norm": 0.6875, "acc_norm_stderr": 0.02815637344037142 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6764705882352942, "acc_stderr": 0.018926082916083383, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.018926082916083383 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 
0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7428571428571429, "acc_stderr": 0.02797982353874455, "acc_norm": 0.7428571428571429, "acc_norm_stderr": 0.02797982353874455 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169136, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169136 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774709, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699122, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.6303549571603427, "mc1_stderr": 0.01689818070697388, "mc2": 0.7781424860062839, "mc2_stderr": 0.013751565023330138 }, "harness|winogrande|5": { "acc": 0.8453038674033149, "acc_stderr": 0.010163172650433535 }, "harness|gsm8k|5": { "acc": 0.6967399545109931, "acc_stderr": 0.012661502663418697 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
openlifescienceai/mmlu_clinical_knowledge
--- dataset_info: features: - name: subject_name dtype: string - name: data struct: - name: Correct Answer dtype: string - name: Correct Option dtype: string - name: Options struct: - name: A dtype: string - name: B dtype: string - name: C dtype: string - name: D dtype: string - name: Question dtype: string - name: id dtype: string splits: - name: test num_bytes: 88710 num_examples: 265 - name: validation num_bytes: 9355 num_examples: 29 - name: dev num_bytes: 1662 num_examples: 5 download_size: 89460 dataset_size: 99727 configs: - config_name: default data_files: - split: test path: data/test-* - split: validation path: data/validation-* - split: dev path: data/dev-* ---
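For readers new to the nested `data` struct declared above, a single row looks like the following sketch (the question, options, and `id` here are invented placeholders for illustration, not actual rows from this dataset):

```python
# A hand-written row mirroring the schema declared above.
# The content is an invented placeholder, not actual dataset data.
row = {
    "subject_name": "clinical_knowledge",
    "data": {
        "Correct Answer": "The kidney",
        "Correct Option": "B",
        "Options": {
            "A": "The liver",
            "B": "The kidney",
            "C": "The spleen",
            "D": "The heart",
        },
        "Question": "Which organ filters the blood to produce urine?",
    },
    "id": "example-0",
}

# The answer text is recovered by indexing Options with Correct Option.
key = row["data"]["Correct Option"]
assert row["data"]["Options"][key] == row["data"]["Correct Answer"]
```

The real splits (`test`, `validation`, `dev`) can be loaded with `load_dataset("openlifescienceai/mmlu_clinical_knowledge", split="test")`, after which each element carries this shape.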
xPXXX/test_ragas
--- license: mit ---
MPrabhu/textbookQA
--- dataset_info: features: - name: level dtype: string - name: question dtype: string splits: - name: train num_bytes: 143352 num_examples: 1000 download_size: 61476 dataset_size: 143352 configs: - config_name: default data_files: - split: train path: data/train-* ---
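A quick sketch of grouping questions by the declared `level` field (the rows below are invented placeholders; the real split can be loaded with `load_dataset("MPrabhu/textbookQA", split="train")`):

```python
from collections import defaultdict

# Invented placeholder rows matching the declared features (level, question).
rows = [
    {"level": "easy", "question": "What is a cell?"},
    {"level": "hard", "question": "Explain oxidative phosphorylation."},
    {"level": "easy", "question": "Name the states of matter."},
]

# Group questions by difficulty level.
by_level = defaultdict(list)
for row in rows:
    by_level[row["level"]].append(row["question"])

print({level: len(qs) for level, qs in by_level.items()})  # → {'easy': 2, 'hard': 1}
```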
jhopela/dolly_train
--- license: openrail ---
Liareizz/MANUGAVASSI
--- license: openrail ---
CyberHarem/kisaragi_azurlane
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of kisaragi/如月/如月 (Azur Lane) This is the dataset of kisaragi/如月/如月 (Azur Lane), containing 433 images and their tags. The core tags of this character are `animal_ears, long_hair, pink_hair, cat_ears, ribbon, cat_tail, tail, hat, bow, pink_eyes, animal_ear_fluff, tail_ornament, school_hat, cat_girl, bangs, hair_between_eyes, yellow_headwear, one_side_up, very_long_hair, purple_eyes, ears_through_headwear, hair_ribbon, red_ribbon`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 433 | 447.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kisaragi_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 433 | 279.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kisaragi_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1032 | 615.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kisaragi_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 433 | 404.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kisaragi_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
| | stage3-p480-1200 | 1032 | 834.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kisaragi_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/kisaragi_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, blue_shirt, blush, kindergarten_uniform, looking_at_viewer, solo, long_sleeves, yellow_bowtie, parted_lips, simple_background, upper_body, :o, white_background, white_sailor_collar | | 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, jingle_bell, kindergarten_uniform, simple_background, solo, white_background, blue_shirt, tail_ribbon, white_thighhighs, yellow_skirt, looking_at_viewer, chibi | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, blue_shirt, blush, kindergarten_uniform, simple_background, solo, yellow_skirt, looking_at_viewer, white_background, own_hands_together, pleated_skirt | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, blue_shirt, blush, full_body, jingle_bell, 
kindergarten_uniform, long_sleeves, looking_at_viewer, parted_lips, pleated_skirt, red_bow, solo, tail_bell, tail_bow, white_sailor_collar, yellow_skirt, bowtie, chibi, red_eyes, twitter_username, white_thighhighs, yellow_bow, :o, own_hands_together, black_footwear, blue_background, standing_on_one_leg | | 4 | 20 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, black_headwear, black_skirt, hair_bow, retrofit_(azur_lane), short_sleeves, solo, suspender_skirt, white_shirt, beret, black_bow, blush, looking_at_viewer, two_side_up, white_pantyhose, collared_shirt, jingle_bell, pleated_skirt, dress_shirt, tail_bell, tail_bow, neck_ribbon, pink_bow, pink_ribbon, striped_bow, tail_ribbon, parted_lips, black_footwear, full_body, white_background, anchor, open_mouth, shoes | | 5 | 8 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, black_headwear, black_skirt, jingle_bell, short_sleeves, solo, suspender_skirt, tail_ribbon, white_shirt, two_side_up, white_pantyhose, simple_background, blush, hair_bow, retrofit_(azur_lane), white_background, school_uniform, legs, looking_at_viewer | | 6 | 8 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1boy, 1girl, blush, hetero, loli, open_mouth, flat_chest, nipples, penis, spread_legs, navel, sex, solo_focus, tears, vaginal, nude, thighhighs, bar_censor, blue_shirt, heart-shaped_pupils, kindergarten_uniform, missionary, moaning, mosaic_censoring, on_back, bell, cum_in_pussy, torso_grab | | 7 | 24 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) 
| 1girl, blush, hair_bow, jingle_bell, obi, solo, wide_sleeves, hair_flower, looking_at_viewer, white_pantyhose, pink_kimono, red_bow, long_sleeves, floral_print, short_kimono, frills, fur_collar, pink_skirt, print_kimono, parted_lips, dog, white_background, animal_on_head, pink_flower | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_shirt | blush | kindergarten_uniform | looking_at_viewer | solo | long_sleeves | yellow_bowtie | parted_lips | simple_background | upper_body | :o | white_background | white_sailor_collar | jingle_bell | tail_ribbon | white_thighhighs | yellow_skirt | chibi | own_hands_together | pleated_skirt | full_body | red_bow | tail_bell | tail_bow | bowtie | red_eyes | twitter_username | yellow_bow | black_footwear | blue_background | standing_on_one_leg | black_headwear | black_skirt | hair_bow | retrofit_(azur_lane) | short_sleeves | suspender_skirt | white_shirt | beret | black_bow | two_side_up | white_pantyhose | collared_shirt | dress_shirt | neck_ribbon | pink_bow | pink_ribbon | striped_bow | anchor | open_mouth | shoes | school_uniform | legs | 1boy | hetero | loli | flat_chest | nipples | penis | spread_legs | navel | sex | solo_focus | tears | vaginal | nude | thighhighs | bar_censor | heart-shaped_pupils | missionary | moaning | mosaic_censoring | on_back | bell | cum_in_pussy | torso_grab | obi | wide_sleeves | hair_flower | pink_kimono | floral_print | short_kimono | frills | fur_collar | pink_skirt | print_kimono | dog | animal_on_head | pink_flower | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------|:-----------------------|:--------------------|:-------|:---------------|:----------------|:--------------|:--------------------|:-------------|:-----|:-------------------|:----------------------|:--------------|:--------------|:-------------------|:---------------|:--------|:---------------------|:----------------|:------------|:----------|:------------|:-----------|:---------|:-----------|:-------------------|:-------------|:-----------------|:------------------|:----------------------|:-----------------|:--------------|:-----------|:-----------------------|:----------------|:------------------|:--------------|:--------|:------------|:--------------|:------------------|:-----------------|:--------------|:--------------|:-----------|:--------------|:--------------|:---------|:-------------|:--------|:-----------------|:-------|:-------|:---------|:-------|:-------------|:----------|:--------|:--------------|:--------|:------|:-------------|:--------|:----------|:-------|:-------------|:-------------|:----------------------|:-------------|:----------|:-------------------|:----------|:-------|:---------------|:-------------|:------|:---------------|:--------------|:--------------|:---------------|:---------------|:---------|:-------------|:-------------|:---------------|:------|:-----------------|:--------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 7 | ![](samples/1/clu1-sample0.png) | 
![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | X | X | | | | X | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | X | | | | X | | | X | | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | X | X | X | | X | | | X | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 20 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | | X | X | | | X | | | | X | | X | X | | | | | X | X | | X | X | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 8 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | | X | X | | | | X | | | X | | X | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | X | X | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 8 | 
![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | 7 | 24 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | X | | X | X | X | | X | | | | X | | X | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
distilled-one-sec-cv12-each-chunk-uniq/chunk_88
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 1337055356.0 num_examples: 260533 download_size: 1369319733 dataset_size: 1337055356.0 --- # Dataset Card for "chunk_88" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
jiandong/crimson-cve-mapping
--- license: unknown dataset_info: features: - name: cve_id dtype: string - name: cve_mapping struct: - name: cve_id dtype: string - name: explaination dtype: string - name: exploit_techniques list: - name: exploit_method dtype: string - name: exploitation_technique struct: - name: id dtype: string - name: name dtype: string - name: functionality struct: - name: gained_functionality sequence: string - name: primary_impact list: - name: id dtype: string - name: name dtype: string - name: secondary_impact list: - name: id dtype: string - name: name dtype: string - name: vulnerability_type struct: - name: exploitation_techniques list: - name: id dtype: string - name: name dtype: string - name: primary_impact list: - name: id dtype: string - name: name dtype: string - name: secondary_impact list: - name: id dtype: string - name: name dtype: string - name: type dtype: string - name: description dtype: string - name: cvss_version dtype: string - name: cvss_severity dtype: string - name: cvss_base_score dtype: float64 - name: related_attcks list: - name: id dtype: string - name: name dtype: string - name: gt_attcks list: - name: id dtype: string - name: name dtype: string - name: attck_patterns list: - name: brief struct: - name: long_text dtype: string - name: short_text dtype: string - name: description dtype: string - name: metadata struct: - name: external_id dtype: string - name: id dtype: string - name: kill_chain_phases list: - name: kill_chain_name dtype: string - name: phase_name dtype: string - name: name dtype: string - name: x_mitre_domains sequence: string - name: x_mitre_is_subtechnique dtype: bool - name: x_mitre_platforms sequence: string - name: x_mitre_tactic_type sequence: string - name: relationships list: - name: id dtype: string - name: relationship_type dtype: string - name: source_ref dtype: string - name: target_ref dtype: string - name: related_attck_patterns list: - name: brief struct: - name: long_text dtype: string - name: short_text dtype: 
string - name: description dtype: string - name: metadata struct: - name: external_id dtype: string - name: id dtype: string - name: kill_chain_phases list: - name: kill_chain_name dtype: string - name: phase_name dtype: string - name: name dtype: string - name: x_mitre_domains sequence: string - name: x_mitre_is_subtechnique dtype: bool - name: x_mitre_platforms sequence: string - name: x_mitre_tactic_type sequence: string - name: relationships list: - name: id dtype: string - name: relationship_type dtype: string - name: source_ref dtype: string - name: target_ref dtype: string - name: count_attck_patterns dtype: int64 - name: count_related_attck_patterns dtype: int64 splits: - name: train num_bytes: 31954455.207373273 num_examples: 1215 - name: test num_bytes: 7995188.792626728 num_examples: 304 download_size: 2567859 dataset_size: 39949644.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
SKSowe/Models_Downloads_and_Likes_Metrics
--- license: apache-2.0 language: - ff - wo pretty_name: DBIS --- # Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. 
## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
Writo/realestate_data
--- license: mit ---
KAUE2006/PatatiPauloTeixeira
--- license: openrail ---
autoevaluate/autoeval-staging-eval-project-adversarial_qa-1cd241d3-12195624
--- type: predictions tags: - autotrain - evaluation datasets: - adversarial_qa eval_info: task: extractive_question_answering model: deepset/roberta-large-squad2 metrics: [] dataset_name: adversarial_qa dataset_config: adversarialQA dataset_split: validation col_mapping: context: context question: question answers-text: answers.text answers-answer_start: answers.answer_start --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Question Answering * Model: deepset/roberta-large-squad2 * Dataset: adversarial_qa * Config: adversarialQA * Split: validation To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@ceyda](https://huggingface.co/ceyda) for evaluating this model.
multimodalart/lora-ease-helper
--- license: mit ---
pfcheng123/test2
--- license: other ---
Hack90/ncbi_genbank_part_23
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: id dtype: string - name: sequence dtype: string - name: name dtype: string - name: description dtype: string - name: features dtype: int64 - name: seq_length dtype: int64 splits: - name: train num_bytes: 32376257548 num_examples: 660870 download_size: 14556362694 dataset_size: 32376257548 --- # Dataset Card for "ncbi_genbank_part_23" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
harshinde/Covid-19_country_vaccinations
--- license: mit language: - en ---
Alphonsce/Fef_dataset
--- dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 5469372.0 num_examples: 21 download_size: 5470869 dataset_size: 5469372.0 configs: - config_name: default data_files: - split: train path: data/train-* --- # That's photos of my dog :)
Copninich/PongSawaML
--- license: apache-2.0 language: - th tags: - history pretty_name: s task_categories: - summarization size_categories: - 1K<n<10K --- # PongSawaML PongSawaML is a Thai annal (royal chronicle) covering Ayutthaya, Thonburi, and Rattanakosin (Rama I). This annal was written by Dan Beach Bradley, an American Protestant missionary.
davanstrien/raw-tldr-dataset
--- dataset_info: features: - name: datasetId dtype: string - name: author dtype: string - name: last_modified dtype: timestamp[us, tz=UTC] - name: downloads dtype: int64 - name: likes dtype: int64 - name: tags sequence: string - name: task_categories sequence: string - name: createdAt dtype: timestamp[us, tz=UTC] - name: card dtype: string - name: parsed_card dtype: string - name: length dtype: int64 - name: input dtype: string - name: generation_model sequence: string - name: generation_prompt sequence: string - name: raw_generation_responses sequence: string - name: generations sequence: string splits: - name: train num_bytes: 19291771 num_examples: 500 download_size: 7858010 dataset_size: 19291771 configs: - config_name: default data_files: - split: train path: data/train-* ---
liuyanchen1015/MULTI_VALUE_qqp_clause_final_though_but
--- dataset_info: features: - name: question1 dtype: string - name: question2 dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: dev num_bytes: 914 num_examples: 5 - name: test num_bytes: 10119 num_examples: 42 - name: train num_bytes: 9114 num_examples: 37 download_size: 23042 dataset_size: 20147 --- # Dataset Card for "MULTI_VALUE_qqp_clause_final_though_but" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
iwasjohnlennon/JayAraeEssexArchive
--- task_categories: - text-classification language: - en tags: - medical - music - biology - chemistry - art - climate size_categories: - 100K<n<1M --- 300-400 hours of video turned into text, plus his Twitter tweets, transcribed with the Whisper large-v2 model. Jay Essex has written 3 books and made about 1200 videos, of which only about 800 can be found online, unless someone has backup storage of past YouTube videos, as his old YouTube channel was removed before a fan could fully back it up. This goes in depth on a variety of topics, a lot of which have never been shared anywhere else. Some include DNA ICUC (Evolution), Source energy, aliens, extraterrestrials, spiritual awakening, psychic development, creation's history, earth's history, creation's future, earth's future, who was god, talk about angels, what types of spirits/souls there are, how to awaken metaphysically, tools that help psychic abilities develop, talk about aliens like the Anunnaki and more, facts about dragon and unicorn spirits, crystals and stones, divination tools, spirit guides, and a whole lot more. For a list of problems with this data and how it was made, go here: https://www.youtube.com/watch?v=TBUDd3EVX6A Here are even more topics he covers, although some topics might only be found in his books, and at the moment the books are not included in this dataset.
The New Universal Alliance, Drachk, N'Antids, Solar System, Alliance of Planets, Arae, Lilly, source field, akashic records, energy healing, star essenite, earthquake, tectonic plate splits, abuse system, freedom, trump, joe biden, government, military, et, alien hybrid, space travel, time travel, universe, law of attraction (myth), metaphysical, self awareness, energy flowing, flow state, relaxation tips, guided meditations, religions, jesus, qeeg test results, numerology, spirit core, angels, dreams, stone energy, Dreams, Visions, Deja-Vu, Spirit Guides (w/Ear Ringing), Ghost, Demons, Exorcisms, Energetic Imprint Recordings, Dowsing Rods, Pendulums, Kinesiology, Pictures, Dimensions, Barriers, Mirrors, Ouija Boards, Darting Black Spots in the Corners of Your Eyes, Sage, Spontaneous Combustion, spirit attack, spirit protection, flow within to flow without outwards, the spiritual foundation, thespiritualfoundation, Gandhi reincarnated, reincarnation, past lives, third eye, pineal gland, nervous system, george washington, Tomoe Gozen, Johann Sebastian Bach, color therapy, android, cyborg, telekinesis, kundalini awakening, gaia
BigFatDoughtnut/henryk-sienkiewicz-w-pustyni-i-w-puszczy
--- license: mit ---
male-2/evaluation_align_v1__store_baehanjin_work_ml-training_v3_merged-public
--- dataset_info: features: - name: Aspect dtype: string - name: Sub-Aspect dtype: string - name: Query dtype: string - name: type dtype: string - name: Dialogue dtype: string - name: Response dtype: string splits: - name: train num_bytes: 475 num_examples: 1 download_size: 5647 dataset_size: 475 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "evaluation_align_v1__store_baehanjin_work_ml-training_v3_merged-public" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
avitri/eng-fra
--- license: mit ---
classla/ParlaSpeech-HR
--- dataset_info: features: - name: id dtype: string - name: audio dtype: audio: sampling_rate: 16000 - name: text dtype: string - name: text_normalised dtype: string - name: words list: - name: char_e dtype: int64 - name: char_s dtype: int64 - name: time_e dtype: float64 - name: time_s dtype: float64 - name: audio_length dtype: float64 - name: date dtype: string - name: speaker_name dtype: string - name: speaker_gender dtype: string - name: speaker_birth dtype: string - name: speaker_party dtype: string - name: party_orientation dtype: string - name: party_status dtype: string splits: - name: train num_bytes: 162874686121.866 num_examples: 867581 download_size: 179092718936 dataset_size: 162874686121.866 configs: - config_name: default data_files: - split: train path: data/train-* --- # The Croatian Parliamentary Spoken Dataset ParlaSpeech-HR 2.0 http://hdl.handle.net/11356/1914 The ParlaSpeech-HR dataset is built from the transcripts of parliamentary proceedings available in the Croatian part of the ParlaMint corpus, and the parliamentary recordings available from the Croatian Parliament's YouTube channel. The corpus consists of audio segments that correspond to specific sentences in the transcripts. The transcript contains word-level alignments to the recordings, each instance consisting of character and millisecond start and end offsets, allowing for simple further segmentation of long sentences into shorter segments for ASR and other memory-sensitive applications. Sequences longer than 30 seconds have already been removed from this dataset, which should allow for a simple usage on most modern GPUs. Each segment has an identifier reference to the ParlaMint 4.0 corpus (http://hdl.handle.net/11356/1859) via the utterance ID and character offsets. 
While in the original dataset all the speaker information from the ParlaMint corpus is available via the `speaker_info` attribute, in the HuggingFace version only a subset of metadata is available, namely: the date, the name of the speaker, their gender, year of birth, party affiliation at that point in time, status of the party at that point in time (coalition or opposition), and party orientation (left, right, centre etc.). Unlike the original dataset, this version also has a `text_normalised` attribute, which contains the text with parliamentary comments (`[[Applause]]` and similar) removed. If you use the dataset, please cite the following paper: ``` @inproceedings{ljubesic-etal-2022-parlaspeech, title = "{P}arla{S}peech-{HR} - a Freely Available {ASR} Dataset for {C}roatian Bootstrapped from the {P}arla{M}int Corpus", author = "Ljube{\v{s}}i{\'c}, Nikola and Kor{\v{z}}inek, Danijel and Rupnik, Peter and Jazbec, Ivo-Pavao", editor = "Fi{\v{s}}er, Darja and Eskevich, Maria and Lenardi{\v{c}}, Jakob and de Jong, Franciska", booktitle = "Proceedings of the Workshop ParlaCLARIN III within the 13th Language Resources and Evaluation Conference", month = jun, year = "2022", address = "Marseille, France", publisher = "European Language Resources Association", url = "https://aclanthology.org/2022.parlaclarin-1.16", pages = "111--116", } ```
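The word-level alignments described above (character and millisecond start/end offsets per word) make the "simple further segmentation" the card mentions easy to sketch. The helper below is an illustrative sketch, not part of the dataset tooling; the sample `item` is a made-up row that only mimics the schema's shape (`words` entries with `char_s`, `char_e`, `time_s`, `time_e`).

```python
# Sketch: cut one ParlaSpeech-HR sentence into shorter timed chunks using
# the word-level alignments. Each chunk maps back to a text span (via the
# character offsets) and an audio interval (via the time offsets).

def segment_words(text, words, max_seconds):
    """Greedily group aligned words into chunks no longer than max_seconds."""
    chunks, current = [], []
    for w in words:
        if current and w['time_e'] - current[0]['time_s'] > max_seconds:
            chunks.append(current)
            current = []
        current.append(w)
    if current:
        chunks.append(current)
    return [
        {
            'text': text[c[0]['char_s']:c[-1]['char_e']],
            'time_s': c[0]['time_s'],
            'time_e': c[-1]['time_e'],
        }
        for c in chunks
    ]

item = {  # toy example in the schema's shape, not a real corpus row
    'text': 'Hvala lijepa svima',
    'words': [
        {'char_s': 0, 'char_e': 5, 'time_s': 0.0, 'time_e': 0.4},
        {'char_s': 6, 'char_e': 12, 'time_s': 0.4, 'time_e': 0.9},
        {'char_s': 13, 'char_e': 18, 'time_s': 0.9, 'time_e': 1.5},
    ],
}
print(segment_words(item['text'], item['words'], max_seconds=1.0))
```

On a real row, the resulting `time_s`/`time_e` intervals can then be used to slice the decoded `audio` array before feeding segments to an ASR model.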
AdapterOcean/med_alpaca_standardized_cluster_7_std
--- dataset_info: features: - name: message dtype: string - name: message_type dtype: string - name: message_id dtype: int64 - name: conversation_id dtype: int64 - name: cluster dtype: float64 - name: __index_level_0__ dtype: int64 splits: - name: train num_bytes: 13037706 num_examples: 22584 download_size: 6735945 dataset_size: 13037706 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "med_alpaca_standardized_cluster_7_std" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
augtoma/medmcqa
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: id dtype: string - name: question dtype: string - name: cop dtype: class_label: names: '0': a '1': b '2': c '3': d - name: choice_type dtype: string - name: exp dtype: string - name: subject_name dtype: string - name: topic_name dtype: string - name: options struct: - name: A dtype: string - name: B dtype: string - name: C dtype: string - name: D dtype: string - name: answer_idx dtype: string - name: answer dtype: string splits: - name: train num_bytes: 136988451 num_examples: 182822 - name: test num_bytes: 2350095 num_examples: 4183 download_size: 90978864 dataset_size: 139338546 --- # Dataset Card for "medmcqa" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
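Since `cop` is stored as a ClassLabel over the lowercase names `a`-`d` while the option texts sit in a struct keyed with uppercase `A`-`D`, resolving a row's correct answer takes a small mapping step. A minimal sketch, using a made-up row rather than real medmcqa data:

```python
# Sketch: resolve the correct option for one (hypothetical) medmcqa row.
COP_NAMES = ['a', 'b', 'c', 'd']  # ClassLabel names from the dataset_info above

def resolve_answer(row):
    letter = COP_NAMES[row['cop']]            # e.g. 2 -> 'c'
    return letter, row['options'][letter.upper()]  # look up the option text

row = {
    'question': 'Which vitamin deficiency causes scurvy?',
    'cop': 2,
    'options': {'A': 'Vitamin A', 'B': 'Vitamin B12',
                'C': 'Vitamin C', 'D': 'Vitamin D'},
}
print(resolve_answer(row))  # ('c', 'Vitamin C')
```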
Bepitic/DM-Description-Action
--- license: gpl ---
YaTharThShaRma999/autotrain-data-flant5finetune
--- task_categories: - summarization --- # AutoTrain Dataset for project: flant5finetune ## Dataset Description This dataset has been automatically processed by AutoTrain for project flant5finetune. ### Languages The BCP-47 code for the dataset's language is unk. ## Dataset Structure ### Data Instances A sample from this dataset looks as follows: ```json [ { "text": "Describe this image of a serene lake and turn it into a starry night scene.", "target": "Tools: Image captioning, Image editing\nInput: {Image}, turn it into a starry night scene\n" }, { "text": "Generate a melodic rock track for an energetic outdoor adventure video.", "target": "Tools: Music generation\nInput: melodic rock track for an energetic outdoor adventure video\n" } ] ``` ### Dataset Fields The dataset has the following fields (also called "features"): ```json { "text": "Value(dtype='string', id=None)", "target": "Value(dtype='string', id=None)" } ``` ### Dataset Splits This dataset is split into train and validation splits. The split sizes are as follows: | Split name | Num samples | | ------------ | ------------------- | | train | 23 | | valid | 6 |
tyzhu/lmind_hotpot_train5000_eval5000_v1_ic_qa
--- configs: - config_name: default data_files: - split: train_qa path: data/train_qa-* - split: train_recite_qa path: data/train_recite_qa-* - split: train_ic_qa path: data/train_ic_qa-* - split: eval_qa path: data/eval_qa-* - split: eval_recite_qa path: data/eval_recite_qa-* - split: eval_ic_qa path: data/eval_ic_qa-* - split: all_docs path: data/all_docs-* - split: all_docs_eval path: data/all_docs_eval-* - split: train path: data/train-* - split: validation path: data/validation-* dataset_info: features: - name: inputs dtype: string - name: targets dtype: string - name: answers struct: - name: answer_start sequence: 'null' - name: text sequence: string splits: - name: train_qa num_bytes: 864508 num_examples: 5000 - name: train_recite_qa num_bytes: 5350190 num_examples: 5000 - name: train_ic_qa num_bytes: 5345190 num_examples: 5000 - name: eval_qa num_bytes: 813536 num_examples: 5000 - name: eval_recite_qa num_bytes: 5394796 num_examples: 5000 - name: eval_ic_qa num_bytes: 5345190 num_examples: 5000 - name: all_docs num_bytes: 8524332 num_examples: 18224 - name: all_docs_eval num_bytes: 8523131 num_examples: 18224 - name: train num_bytes: 5345190 num_examples: 5000 - name: validation num_bytes: 5345190 num_examples: 5000 download_size: 30643792 dataset_size: 50851253 --- # Dataset Card for "lmind_hotpot_train5000_eval5000_v1_ic_qa" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
distilled-one-sec-cv12-each-chunk-uniq/chunk_10
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 1037767380.0 num_examples: 202215 download_size: 1059033634 dataset_size: 1037767380.0 --- # Dataset Card for "chunk_10" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
CyberHarem/talulah_arknights
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of Talulah/タルラ/塔露拉 (Arknights) This is the dataset of Talulah/タルラ/塔露拉 (Arknights), containing 218 images and their tags. The core tags of this character are `horns, grey_hair, dragon_horns, hair_intakes, short_hair, breasts, dragon_girl, hair_between_eyes`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 218 | 392.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/talulah_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 1200 | 218 | 326.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/talulah_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 544 | 628.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/talulah_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. 
If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/talulah_arknights', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, long_sleeves, black_ascot, looking_at_viewer, black_dress, upper_body, frills, white_shirt, armband, closed_mouth, simple_background, white_background, puffy_sleeves, medium_breasts | | 1 | 11 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_dress, long_sleeves, solo, dragon_tail, looking_at_viewer, standing, holding_sword, fire, feet_out_of_frame, sky, 
ascot, black_footwear, outdoors, skirt, armband, building, full_body, high_heel_boots, pantyhose, shirt | | 2 | 9 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | black_dress, deer_antlers, deer_ears, deer_girl, long_sleeves, shirt, long_hair, smile, 2girls, closed_eyes, open_mouth, simple_background, solo_focus, white_background, epaulettes, pinafore_dress, tail, 1girl, black_neckerchief, holding | | 3 | 14 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | looking_at_viewer, bare_shoulders, 1girl, dragon_tail, navel, outdoors, solo, day, stomach, thighs, black_bikini, blue_sky, medium_breasts, sunglasses, water, beach, long_hair, sitting, cleavage, cloud, grey_eyes, open_mouth, smile, alternate_costume, bare_arms, bare_legs, eyewear_on_head, large_breasts, ocean | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | long_sleeves | black_ascot | looking_at_viewer | black_dress | upper_body | frills | white_shirt | armband | closed_mouth | simple_background | white_background | puffy_sleeves | medium_breasts | dragon_tail | standing | holding_sword | fire | feet_out_of_frame | sky | ascot | black_footwear | outdoors | skirt | building | full_body | high_heel_boots | pantyhose | shirt | deer_antlers | deer_ears | deer_girl | long_hair | smile | 2girls | closed_eyes | open_mouth | solo_focus | epaulettes | pinafore_dress | tail | black_neckerchief | holding | bare_shoulders | navel | day | stomach | thighs | black_bikini | blue_sky | sunglasses | water | beach | sitting | cleavage | cloud | grey_eyes | alternate_costume | bare_arms | bare_legs | eyewear_on_head | large_breasts | ocean | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:--------------|:--------------------|:--------------|:-------------|:---------|:--------------|:----------|:---------------|:--------------------|:-------------------|:----------------|:-----------------|:--------------|:-----------|:----------------|:-------|:--------------------|:------|:--------|:-----------------|:-----------|:--------|:-----------|:------------|:------------------|:------------|:--------|:---------------|:------------|:------------|:------------|:--------|:---------|:--------------|:-------------|:-------------|:-------------|:-----------------|:-------|:--------------------|:----------|:-----------------|:--------|:------|:----------|:---------|:---------------|:-----------|:-------------|:--------|:--------|:----------|:-----------|:--------|:------------|:--------------------|:------------|:------------|:------------------|:----------------|:--------| | 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 11 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | X | X | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 9 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | | | 
X | | | | | | X | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | 3 | 14 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | | X | | | | | | | | | | X | X | | | | | | | | X | | | | | | | | | | X | X | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
jxm/mpqa
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* - split: dev path: data/dev-* dataset_info: features: - name: sentence dtype: string - name: label dtype: int64 splits: - name: train num_bytes: 263258 num_examples: 8603 - name: test num_bytes: 62502 num_examples: 2000 - name: dev num_bytes: 7835 num_examples: 256 download_size: 0 dataset_size: 333595 --- # Dataset Card for "mpqa" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
louisbrulenaudet/cgi
--- license: apache-2.0 language: - fr multilinguality: - monolingual tags: - finetuning - legal - tax - llm - fiscal - cgi - Code Général des Impôts source_datasets: - original pretty_name: Code Général des Impôts (CGI) task_categories: - text-generation - table-question-answering - summarization - conversational size_categories: - 1K<n<10K --- # Code Général des Impôts, non-instruct (11-12-2023) This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for tax practice. Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach. Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks. Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways: - Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions. - Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs. 
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more. - Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs. - Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text. ## Dataset generation This JSON file is a list of dictionaries, each dictionary contains the following fields: - `instruction`: `string`, presenting the instruction linked to the element. - `input`: `string`, signifying the input details for the element. - `output`: `string`, indicating the output information for the element. We used the following list of instructions for generating the dataset: ```python instructions = [ "Compose l'intégralité de l'article sous forme écrite.", "Écris la totalité du contenu de l'article.", "Formule la totalité du texte présent dans l'article.", "Produis l'intégralité de l'article en écriture.", "Développe l'article dans son ensemble par écrit.", "Génère l'ensemble du texte contenu dans l'article.", "Formule le contenu intégral de l'article en entier.", "Rédige la totalité du texte de l'article en entier.", "Compose l'intégralité du contenu textuel de l'article.", "Rédige l'ensemble du texte qui constitue l'article.", "Formule l'article entier dans son contenu écrit.", "Composez l'intégralité de l'article sous forme écrite.", "Écrivez la totalité du contenu de l'article.", "Formulez la totalité du texte présent dans l'article.", "Développez l'article dans son ensemble par écrit.", "Générez l'ensemble du texte contenu dans l'article.", "Formulez le contenu intégral de l'article en entier.", "Rédigez la totalité du texte de 
l'article en entier.", "Composez l'intégralité du contenu textuel de l'article.", "Écrivez l'article dans son intégralité en termes de texte.", "Rédigez l'ensemble du texte qui constitue l'article.", "Formulez l'article entier dans son contenu écrit.", "Composer l'intégralité de l'article sous forme écrite.", "Écrire la totalité du contenu de l'article.", "Formuler la totalité du texte présent dans l'article.", "Produire l'intégralité de l'article en écriture.", "Développer l'article dans son ensemble par écrit.", "Générer l'ensemble du texte contenu dans l'article.", "Formuler le contenu intégral de l'article en entier.", "Rédiger la totalité du texte de l'article en entier.", "Composer l'intégralité du contenu textuel de l'article.", "Rédiger l'ensemble du texte qui constitue l'article.", "Formuler l'article entier dans son contenu écrit.", "Quelles sont les dispositions de l'article ?", "Quelles dispositions sont incluses dans l'article ?", "Quelles sont les dispositions énoncées dans l'article ?", "Quel est le texte intégral de l'article ?", "Quelle est la lettre de l'article ?" ] ``` ## Citing this project If you use this code in your research, please use the following BibTeX entry. ```BibTeX @misc{louisbrulenaudet2023, author = {Louis Brulé Naudet}, title = {Code Général des Impôts, non-instruct (11-12-2023)}, howpublished = {\url{https://huggingface.co/datasets/louisbrulenaudet/cgi}}, year = {2023} } ``` ## Feedback If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com).
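At its core, the generation step described above pairs each article with a randomly drawn instruction. A minimal sketch of that pairing (a hypothetical reconstruction, not the author's actual script; only two of the templates above are used, and the article text is invented):

```python
import random

# Two of the instruction templates listed above; the article content is made up.
instructions = [
    "Compose l'intégralité de l'article sous forme écrite.",
    "Quel est le texte intégral de l'article ?",
]

def make_record(article_title: str, article_text: str, rng: random.Random) -> dict:
    """Pair a randomly chosen instruction with an article, producing one
    instruction/input/output record with the fields described above."""
    return {
        "instruction": rng.choice(instructions),
        "input": article_title,
        "output": article_text,
    }

rng = random.Random(0)  # seeded for reproducibility
rec = make_record("Article 1 du CGI", "Il est établi un impôt sur le revenu...", rng)
print(sorted(rec))  # ['input', 'instruction', 'output']
```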
autoevaluate/autoeval-eval-acronym_identification-default-ef327c-37654145039
--- type: predictions tags: - autotrain - evaluation datasets: - acronym_identification eval_info: task: entity_extraction model: lewtun/autotrain-acronym-identification-7324788 metrics: ['angelina-wang/directional_bias_amplification'] dataset_name: acronym_identification dataset_config: default dataset_split: train col_mapping: tokens: tokens tags: labels --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Token Classification * Model: lewtun/autotrain-acronym-identification-7324788 * Dataset: acronym_identification * Config: default * Split: train To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@qingxuwenli](https://huggingface.co/qingxuwenli) for evaluating this model.
marcelhuber/downprojection_images
--- task_categories: - image-classification language: - en tags: - medical pretty_name: downprojection_images size_categories: - 1K<n<10K --- # Dataset Structure This dataset contains images categorized into different classes for medical image analysis. The dataset is organized as follows: - `CNV`: Contains 250 images and 250 w-vector files. - `DME`: Contains 250 images and 250 w-vector files. - `DRUSEN`: Contains 250 images and 250 w-vector files. - `NORMAL`: Contains 250 images and 250 w-vector files. ## Usage This dataset can be used for tasks such as classification, image recognition, and medical analysis. The provided class subdirectories indicate the different categories for the images. This dataset is intended for use in an interactive Dash map in which the latent codes of these images have been downprojected using either PCA or t-SNE. Each individual point in the map corresponds to an original image, which is displayed when hovered over. Here is the repository: https://github.com/marceljhuber/Dash-Downprojection-Viewer
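For reference, a minimal NumPy sketch of the PCA variant of such a downprojection (the viewer's actual code lives in the linked repository; the latent vectors below are random stand-ins for the stored w-vectors):

```python
import numpy as np

def pca_project(latents: np.ndarray, n_components: int = 2) -> np.ndarray:
    """Project latent vectors to `n_components` dimensions via PCA,
    computed with an SVD of the centered data matrix."""
    centered = latents - latents.mean(axis=0)
    # Rows of vt are the principal directions, ordered by singular value.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T

rng = np.random.default_rng(0)
latents = rng.normal(size=(1000, 512))  # stand-in for the w-vectors
points = pca_project(latents)
print(points.shape)  # (1000, 2)
```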
DTU54DL/common-voice-test3k
--- annotations_creators: - expert-generated language: - en language_creators: - found license: - mit multilinguality: - monolingual paperswithcode_id: acronym-identification pretty_name: Acronym Identification Dataset size_categories: - 10K<n<100K source_datasets: - original task_categories: - token-classification task_ids: - token-classification-other-acronym-identification train-eval-index: - col_mapping: labels: tags tokens: tokens config: default splits: eval_split: test task: token-classification task_id: entity_extraction --- # Dataset Card for [Dataset Name] ## Table of Contents - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** - **Repository:** - **Paper:** - **Leaderboard:** - **Point of Contact:** ### Dataset Summary [More Information Needed] ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information 
Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset.
DeliberatorArchiver/gi_cutscn_new
--- license: cc-by-nc-nd-4.0 language: - zh - en - ja - ko viewer: false --- # gi_cutscn_new This repository contains cut-scene video files from a certain anime game (a.k.a. GI). ![Release Channel](https://img.shields.io/badge/dynamic/json?url=https%3A%2F%2Fhuggingface.co%2Fdatasets%2FDeliberatorArchiver%2Fgi_cutscn_new%2Fresolve%2Fmain%2Fversion.json&query=%24.channel&label=Channel) ![Release Version](https://img.shields.io/badge/dynamic/json?url=https%3A%2F%2Fhuggingface.co%2Fdatasets%2FDeliberatorArchiver%2Fgi_cutscn_new%2Fresolve%2Fmain%2Fversion.json&query=%24.version&label=Version) ![Official Version](https://img.shields.io/badge/dynamic/json?url=https%3A%2F%2Fis.gd%2FQ2DQ02&query=%24.data.game.latest.version&label=Official%20Version) ## Disclaimer **This resource is released for educational or research purposes only. Copyrights and other rights to this resource belong to their respective copyright holders.** ## About All cut-scene video files are encoded using HLS streaming technology. ### Details The original files were extracted directly from the game. The original file contains one video track and four audio tracks (Chinese, English, Japanese, and Korean). The video track is encoded in VP9 and the audio track is encoded in CRI HCA. All video and audio tracks are packed in CRI USM. ### Details of encoded video files See [Gist](https://gist.github.com/daydreamer-json/8e0f2bf2025db209a9727ad4f2dd983a) for the parameters used during encoding. 
|Type|Level|Audio|Resolution|Codec|Color|Max Bitrate|Avg Bitrate| |---|---|---|---|---|---|---|---| |Video|L-5|L-2|1920 x 1080|VP9|YUV 4:2:0 8bit BT.601|Copy|Copy| |Video|L-4|L-1|1920 x 1080|HEVC|YUV 4:2:0 8bit BT.709|12800 kbps|9600 kbps| |Video|L-3|L-1|1280 x 720|HEVC|YUV 4:2:0 8bit BT.709|6400 kbps|4800 kbps| |Video|L-2|L-0|854 x 480|HEVC|YUV 4:2:0 8bit BT.709|3600 kbps|2400 kbps| |Video|L-1|L-0|640 x 360|HEVC|YUV 4:2:0 8bit BT.709|2400 kbps|1600 kbps| |Video|L-0|L-0|426 x 240|HEVC|YUV 4:2:0 8bit BT.709|1600 kbps|800 kbps| |Type|Level|Format|Codec|Max Bitrate|Avg Bitrate| |---|---|---|---|---|---| |Audio|L-2|16bit 48kHz 2ch|FLAC|1536 kbps|Lossless| |Audio|L-1|16bit 48kHz 2ch|AAC|? kbps|256 kbps| |Audio|L-0|16bit 48kHz 2ch|AAC|? kbps|128 kbps| ## How to watch To watch HLS (HTTP Live Streaming) media manifest files, you will need a player that supports HLS (MPEG-TS and fMP4) demux and can decode various video and audio formats. ### For iOS / iPadOS / macOS: You can open it from the **"Vidstack Player"** link. ### For Android / Windows / Linux: The best thing to do is to use a player that can play HLS, such as mpv player, mpv.net, or VLC. Use the **"Raw"** link to watch. Another way is to use hls.js to play HLS media files in browsers that support the Media Source Extensions API, such as Chrome, Firefox, and Edge. You can open it from the **"Vidstack Player"** link. ## Cut-scene Video Links ### See [Links page](lists.md)
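As an illustration of how a client might use the bitrate ladder above, here is a toy rendition-selection sketch (a generic HLS adaptive-bitrate heuristic, not something shipped with this repository; the VP9 `L-5` copy rendition is left out):

```python
# HEVC video ladder from the table above: (level, height, max_kbps).
ladder = [
    (0, 240, 1600),
    (1, 360, 2400),
    (2, 480, 3600),
    (3, 720, 6400),
    (4, 1080, 12800),
]

def pick_level(bandwidth_kbps: int) -> int:
    """Choose the highest level whose max bitrate fits within the measured
    bandwidth, falling back to the lowest level if none fits."""
    fitting = [lvl for lvl, _height, max_kbps in ladder if max_kbps <= bandwidth_kbps]
    return max(fitting) if fitting else 0

print(pick_level(5000))  # 2  (480p at 3600 kbps fits; 720p needs 6400 kbps)
```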
Lostkyd/InstrucDataPDF
--- dataset_info: features: - name: Instruction dtype: string - name: Input dtype: string - name: Output dtype: string splits: - name: train num_bytes: 335269 num_examples: 87 download_size: 74238 dataset_size: 335269 configs: - config_name: default data_files: - split: train path: data/train-* ---
bigcode/santacoder-token-usage
--- dataset_info: features: - name: token dtype: int64 - name: Java dtype: int64 - name: JavaScript dtype: int64 - name: Python dtype: int64 splits: - name: train num_bytes: 1571808 num_examples: 49119 download_size: 1165252 dataset_size: 1571808 --- # Dataset Card for "santacoder-token-usage" Token usage count per language when tokenizing the `"bigcode/stack-dedup-alt-comments"` dataset with the `santacoder` tokenizer. There are fewer tokens here than in the tokenizer's vocabulary because of a mismatch between the datasets used to train the tokenizer and the ones that ended up being used to train the model. [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
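A minimal sketch of how such per-language usage counts can be produced (the token-id sequences below are made-up toy data; the real counts come from tokenizing the Stack corpus):

```python
from collections import Counter

# Made-up token-id sequences standing in for tokenized files per language.
tokenized = {
    "Python": [[5, 7, 7, 9], [5, 9]],
    "Java": [[7, 7, 7]],
}

def usage_counts(tokenized_by_lang):
    """Count how often each token id appears per language, mirroring the
    token / per-language count layout of this dataset."""
    return {lang: Counter(tok for seq in seqs for tok in seq)
            for lang, seqs in tokenized_by_lang.items()}

counts = usage_counts(tokenized)
print(counts["Python"][7])  # 2
```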
JovialValley/broadclass_totalMapped3
--- dataset_info: features: - name: input_values sequence: float32 - name: labels sequence: int64 splits: - name: train num_bytes: 109539072 num_examples: 390 - name: test num_bytes: 27914744 num_examples: 97 download_size: 138277700 dataset_size: 137453816 --- # Dataset Card for "broadclass_totalMapped3" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mlenjoyneer/RuTextSegWiki
--- annotations_creators: - machine-generated language_creators: - found language: - ru size_categories: - 10K<n<100K license: - unknown multilinguality: - monolingual source_datasets: - original --- # Dataset Card ## Table of Contents - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Additional Information](#additional-information) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) ## Dataset Description ### Dataset Summary A dataset for automatic text segmentation of Russian Wikipedia. The text corpus is based on the May 2023 Wikipedia dump. Markup was generated automatically using two methods: taking texts with an existing division into paragraphs, and randomly joining parts of different texts. ### Supported Tasks and Leaderboards The dataset is designed for the text segmentation task. ### Languages The dataset is in Russian. ### Usage ```python from datasets import load_dataset dataset = load_dataset('mlenjoyneer/RuTextSegWiki') ``` ### Other datasets mlenjoyneer/RuTextSegNews - a similar dataset based on a news corpus ## Dataset Structure ### Data Instances For each instance, there is a list of strings for the text sentences, a list of ints for the labels (1 marks the start of a new topic, 0 marks continuation of the previous topic) and a string for the sample generation method (base or random_joining). ### Data Splits | Dataset Split | Number of Instances in Split | |:---------|:---------| | Train | 20000 | | Test | 4000 | ## Additional Information ### Licensing Information In progress ### Citation Information ```bibtex In progress ```
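Given the label convention above (1 starts a new topic, 0 continues the previous one), segments can be recovered from an instance with a small helper — a sketch, not part of the dataset tooling:

```python
def split_into_segments(sentences, labels):
    """Group sentences into topic segments: label 1 starts a new segment,
    label 0 appends the sentence to the current segment."""
    segments = []
    for sentence, label in zip(sentences, labels):
        if label == 1 or not segments:
            segments.append([sentence])
        else:
            segments[-1].append(sentence)
    return segments

# Toy instance with two topics.
sentences = ["s1", "s2", "s3", "s4", "s5"]
labels = [1, 0, 0, 1, 0]
print(split_into_segments(sentences, labels))  # [['s1', 's2', 's3'], ['s4', 's5']]
```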
philschmid/chip2_oasst1_en_code
--- dataset_info: features: - name: messages list: - name: content dtype: string - name: role dtype: string splits: - name: train num_bytes: 4934232 num_examples: 4687 download_size: 1866641 dataset_size: 4934232 --- # Dataset Card for "chip2_oasst1_en_code" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_maldv__electric-sheep-7b-alpha
--- pretty_name: Evaluation run of maldv/electric-sheep-7b-alpha dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [maldv/electric-sheep-7b-alpha](https://huggingface.co/maldv/electric-sheep-7b-alpha)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maldv__electric-sheep-7b-alpha\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-16T15:55:01.323956](https://huggingface.co/datasets/open-llm-leaderboard/details_maldv__electric-sheep-7b-alpha/blob/main/results_2024-03-16T15-55-01.323956.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5053488597835195,\n\ \ \"acc_stderr\": 0.0341231358277047,\n \"acc_norm\": 0.5096581303380655,\n\ \ \"acc_norm_stderr\": 0.03483649987748136,\n \"mc1\": 0.32068543451652387,\n\ \ \"mc1_stderr\": 0.0163391703732809,\n \"mc2\": 0.48261245911723205,\n\ \ \"mc2_stderr\": 0.015170591571160566\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5051194539249146,\n \"acc_stderr\": 0.014610624890309157,\n\ \ \"acc_norm\": 0.5486348122866894,\n \"acc_norm_stderr\": 0.014542104569955267\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5676160127464649,\n\ \ \"acc_stderr\": 0.004943945069611453,\n \"acc_norm\": 0.7642899820752838,\n\ \ \"acc_norm_stderr\": 0.004235743182042561\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\ \ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\ \ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.04065771002562605,\n\ \ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.04065771002562605\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\ \ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \ \ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5660377358490566,\n \"acc_stderr\": 0.03050329201334259,\n\ \ \"acc_norm\": 0.5660377358490566,\n \"acc_norm_stderr\": 0.03050329201334259\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n\ \ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.5416666666666666,\n\ \ \"acc_norm_stderr\": 0.04166666666666665\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\ : 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n\ \ \"acc_stderr\": 0.03807301726504513,\n \"acc_norm\": 0.47398843930635837,\n\ \ \"acc_norm_stderr\": 0.03807301726504513\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\ \ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\ \ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n\ \ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\ \ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\ \ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\ \ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.35978835978835977,\n \"acc_stderr\": 0.02471807594412928,\n \"\ acc_norm\": 0.35978835978835977,\n 
\"acc_norm_stderr\": 0.02471807594412928\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\ \ \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n\ \ \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6193548387096774,\n\ \ \"acc_stderr\": 0.02762171783290703,\n \"acc_norm\": 0.6193548387096774,\n\ \ \"acc_norm_stderr\": 0.02762171783290703\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.3891625615763547,\n \"acc_stderr\": 0.03430462416103873,\n\ \ \"acc_norm\": 0.3891625615763547,\n \"acc_norm_stderr\": 0.03430462416103873\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\ : 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.03646204963253812,\n\ \ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.03646204963253812\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.6262626262626263,\n \"acc_stderr\": 0.03446897738659333,\n \"\ acc_norm\": 0.6262626262626263,\n \"acc_norm_stderr\": 0.03446897738659333\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7409326424870466,\n \"acc_stderr\": 0.031618779179354115,\n\ \ \"acc_norm\": 0.7409326424870466,\n \"acc_norm_stderr\": 0.031618779179354115\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.44871794871794873,\n \"acc_stderr\": 0.02521731518484648,\n\ \ \"acc_norm\": 0.44871794871794873,\n \"acc_norm_stderr\": 0.02521731518484648\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \ \ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.49159663865546216,\n \"acc_stderr\": 0.032473902765696686,\n\ \ \"acc_norm\": 0.49159663865546216,\n \"acc_norm_stderr\": 0.032473902765696686\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.23178807947019867,\n \"acc_stderr\": 0.03445406271987054,\n \"\ acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.03445406271987054\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.6587155963302752,\n \"acc_stderr\": 0.020328612816592446,\n \"\ acc_norm\": 0.6587155963302752,\n \"acc_norm_stderr\": 0.020328612816592446\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"\ acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.6568627450980392,\n \"acc_stderr\": 0.033321399446680854,\n \"\ acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.033321399446680854\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.6455696202531646,\n \"acc_stderr\": 0.03113730429718582,\n \ \ \"acc_norm\": 0.6455696202531646,\n \"acc_norm_stderr\": 0.03113730429718582\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\ \ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\ \ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.0435644720266507,\n\ \ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.0435644720266507\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.6528925619834711,\n \"acc_stderr\": 0.04345724570292535,\n \"\ acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.04345724570292535\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6296296296296297,\n\ \ \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.6296296296296297,\n\ \ \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6319018404907976,\n \"acc_stderr\": 0.03789213935838396,\n\ \ \"acc_norm\": 0.6319018404907976,\n \"acc_norm_stderr\": 0.03789213935838396\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\ \ \"acc_stderr\": 0.044642857142857116,\n \"acc_norm\": 0.33035714285714285,\n\ \ \"acc_norm_stderr\": 0.044642857142857116\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n\ \ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7735042735042735,\n\ \ \"acc_stderr\": 0.027421007295392916,\n \"acc_norm\": 0.7735042735042735,\n\ \ \"acc_norm_stderr\": 0.027421007295392916\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \ \ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6845466155810983,\n\ \ \"acc_stderr\": 0.01661750173876339,\n \"acc_norm\": 0.6845466155810983,\n\ \ \"acc_norm_stderr\": 0.01661750173876339\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.02675625512966377,\n\ \ \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.02675625512966377\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2122905027932961,\n\ \ \"acc_stderr\": 0.013676644685831718,\n \"acc_norm\": 
0.2122905027932961,\n\ \ \"acc_norm_stderr\": 0.013676644685831718\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.028624412550167958,\n\ \ \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.028624412550167958\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5787781350482315,\n\ \ \"acc_stderr\": 0.028043399858210628,\n \"acc_norm\": 0.5787781350482315,\n\ \ \"acc_norm_stderr\": 0.028043399858210628\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.5709876543209876,\n \"acc_stderr\": 0.027538925613470863,\n\ \ \"acc_norm\": 0.5709876543209876,\n \"acc_norm_stderr\": 0.027538925613470863\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.3546099290780142,\n \"acc_stderr\": 0.028538650028878634,\n \ \ \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.028538650028878634\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3539765319426336,\n\ \ \"acc_stderr\": 0.01221350473173164,\n \"acc_norm\": 0.3539765319426336,\n\ \ \"acc_norm_stderr\": 0.01221350473173164\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121596,\n\ \ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121596\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5196078431372549,\n \"acc_stderr\": 0.020212274976302954,\n \ \ \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.020212274976302954\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\ \ \"acc_stderr\": 0.046737523336702384,\n \"acc_norm\": 0.6090909090909091,\n\ \ \"acc_norm_stderr\": 0.046737523336702384\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.5387755102040817,\n \"acc_stderr\": 0.031912820526692774,\n\ \ \"acc_norm\": 0.5387755102040817,\n \"acc_norm_stderr\": 0.031912820526692774\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n\ \ \"acc_stderr\": 0.03251006816458618,\n \"acc_norm\": 0.6965174129353234,\n\ \ \"acc_norm_stderr\": 0.03251006816458618\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \ \ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\ \ \"acc_stderr\": 0.03889951252827217,\n \"acc_norm\": 0.4819277108433735,\n\ \ \"acc_norm_stderr\": 0.03889951252827217\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.036155076303109365,\n\ \ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.036155076303109365\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32068543451652387,\n\ \ \"mc1_stderr\": 0.0163391703732809,\n \"mc2\": 0.48261245911723205,\n\ \ \"mc2_stderr\": 0.015170591571160566\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7032359905288083,\n \"acc_stderr\": 0.012839239695202032\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2934040940106141,\n \ \ \"acc_stderr\": 0.012541830815461487\n }\n}\n```" repo_url: https://huggingface.co/maldv/electric-sheep-7b-alpha leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|arc:challenge|25_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-16T15-55-01.323956.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|gsm8k|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_16T15_55_01.323956 path: - 
'**/details_harness|hellaswag|10_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T15-55-01.323956.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-16T15-55-01.323956.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T15-55-01.323956.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T15-55-01.323956.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T15-55-01.323956.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-16T15-55-01.323956.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T15-55-01.323956.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-management|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T15-55-01.323956.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|truthfulqa:mc|0_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-16T15-55-01.323956.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_16T15_55_01.323956 path: - '**/details_harness|winogrande|5_2024-03-16T15-55-01.323956.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-16T15-55-01.323956.parquet' - config_name: results data_files: - split: 
2024_03_16T15_55_01.323956 path: - results_2024-03-16T15-55-01.323956.parquet - split: latest path: - results_2024-03-16T15-55-01.323956.parquet --- # Dataset Card for Evaluation run of maldv/electric-sheep-7b-alpha <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [maldv/electric-sheep-7b-alpha](https://huggingface.co/maldv/electric-sheep-7b-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_maldv__electric-sheep-7b-alpha", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-03-16T15:55:01.323956](https://huggingface.co/datasets/open-llm-leaderboard/details_maldv__electric-sheep-7b-alpha/blob/main/results_2024-03-16T15-55-01.323956.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5053488597835195, "acc_stderr": 0.0341231358277047, "acc_norm": 0.5096581303380655, "acc_norm_stderr": 0.03483649987748136, "mc1": 0.32068543451652387, "mc1_stderr": 0.0163391703732809, "mc2": 0.48261245911723205, "mc2_stderr": 0.015170591571160566 }, "harness|arc:challenge|25": { "acc": 0.5051194539249146, "acc_stderr": 0.014610624890309157, "acc_norm": 0.5486348122866894, "acc_norm_stderr": 0.014542104569955267 }, "harness|hellaswag|10": { "acc": 0.5676160127464649, "acc_stderr": 0.004943945069611453, "acc_norm": 0.7642899820752838, "acc_norm_stderr": 0.004235743182042561 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.24, "acc_stderr": 0.04292346959909284, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909284 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4444444444444444, "acc_stderr": 0.04292596718256981, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.04292596718256981 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5197368421052632, "acc_stderr": 0.04065771002562605, "acc_norm": 0.5197368421052632, "acc_norm_stderr": 0.04065771002562605 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5660377358490566, "acc_stderr": 0.03050329201334259, "acc_norm": 0.5660377358490566, "acc_norm_stderr": 0.03050329201334259 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5416666666666666, "acc_stderr": 0.04166666666666665, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.04166666666666665 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 
0.048241815132442176 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.47398843930635837, "acc_stderr": 0.03807301726504513, "acc_norm": 0.47398843930635837, "acc_norm_stderr": 0.03807301726504513 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.27450980392156865, "acc_stderr": 0.04440521906179328, "acc_norm": 0.27450980392156865, "acc_norm_stderr": 0.04440521906179328 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.67, "acc_stderr": 0.04725815626252609, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252609 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.48936170212765956, "acc_stderr": 0.03267862331014063, "acc_norm": 0.48936170212765956, "acc_norm_stderr": 0.03267862331014063 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.37719298245614036, "acc_stderr": 0.04559522141958216, "acc_norm": 0.37719298245614036, "acc_norm_stderr": 0.04559522141958216 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.45517241379310347, "acc_stderr": 0.04149886942192117, "acc_norm": 0.45517241379310347, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.35978835978835977, "acc_stderr": 0.02471807594412928, "acc_norm": 0.35978835978835977, "acc_norm_stderr": 0.02471807594412928 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2777777777777778, "acc_stderr": 0.040061680838488774, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.040061680838488774 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6193548387096774, "acc_stderr": 0.02762171783290703, "acc_norm": 0.6193548387096774, "acc_norm_stderr": 0.02762171783290703 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3891625615763547, "acc_stderr": 0.03430462416103873, "acc_norm": 0.3891625615763547, "acc_norm_stderr": 0.03430462416103873 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6787878787878788, "acc_stderr": 0.03646204963253812, "acc_norm": 0.6787878787878788, "acc_norm_stderr": 0.03646204963253812 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6262626262626263, "acc_stderr": 0.03446897738659333, "acc_norm": 0.6262626262626263, "acc_norm_stderr": 0.03446897738659333 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7409326424870466, "acc_stderr": 0.031618779179354115, "acc_norm": 0.7409326424870466, "acc_norm_stderr": 0.031618779179354115 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.44871794871794873, "acc_stderr": 0.02521731518484648, "acc_norm": 0.44871794871794873, "acc_norm_stderr": 0.02521731518484648 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.27037037037037037, "acc_stderr": 0.027080372815145668, "acc_norm": 0.27037037037037037, "acc_norm_stderr": 0.027080372815145668 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.49159663865546216, "acc_stderr": 0.032473902765696686, "acc_norm": 0.49159663865546216, "acc_norm_stderr": 0.032473902765696686 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.23178807947019867, "acc_stderr": 0.03445406271987054, "acc_norm": 0.23178807947019867, "acc_norm_stderr": 0.03445406271987054 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6587155963302752, "acc_stderr": 0.020328612816592446, "acc_norm": 0.6587155963302752, "acc_norm_stderr": 0.020328612816592446 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.39351851851851855, "acc_stderr": 
0.03331747876370312, "acc_norm": 0.39351851851851855, "acc_norm_stderr": 0.03331747876370312 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6568627450980392, "acc_stderr": 0.033321399446680854, "acc_norm": 0.6568627450980392, "acc_norm_stderr": 0.033321399446680854 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6455696202531646, "acc_stderr": 0.03113730429718582, "acc_norm": 0.6455696202531646, "acc_norm_stderr": 0.03113730429718582 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6188340807174888, "acc_stderr": 0.03259625118416827, "acc_norm": 0.6188340807174888, "acc_norm_stderr": 0.03259625118416827 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5572519083969466, "acc_stderr": 0.0435644720266507, "acc_norm": 0.5572519083969466, "acc_norm_stderr": 0.0435644720266507 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6528925619834711, "acc_stderr": 0.04345724570292535, "acc_norm": 0.6528925619834711, "acc_norm_stderr": 0.04345724570292535 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6296296296296297, "acc_stderr": 0.04668408033024931, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.04668408033024931 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6319018404907976, "acc_stderr": 0.03789213935838396, "acc_norm": 0.6319018404907976, "acc_norm_stderr": 0.03789213935838396 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.33035714285714285, "acc_stderr": 0.044642857142857116, "acc_norm": 0.33035714285714285, "acc_norm_stderr": 0.044642857142857116 }, "harness|hendrycksTest-management|5": { "acc": 0.6601941747572816, "acc_stderr": 0.046897659372781335, "acc_norm": 0.6601941747572816, "acc_norm_stderr": 0.046897659372781335 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7735042735042735, "acc_stderr": 0.027421007295392916, "acc_norm": 0.7735042735042735, "acc_norm_stderr": 0.027421007295392916 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.54, "acc_stderr": 
0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6845466155810983, "acc_stderr": 0.01661750173876339, "acc_norm": 0.6845466155810983, "acc_norm_stderr": 0.01661750173876339 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5549132947976878, "acc_stderr": 0.02675625512966377, "acc_norm": 0.5549132947976878, "acc_norm_stderr": 0.02675625512966377 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2122905027932961, "acc_stderr": 0.013676644685831718, "acc_norm": 0.2122905027932961, "acc_norm_stderr": 0.013676644685831718 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5098039215686274, "acc_stderr": 0.028624412550167958, "acc_norm": 0.5098039215686274, "acc_norm_stderr": 0.028624412550167958 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5787781350482315, "acc_stderr": 0.028043399858210628, "acc_norm": 0.5787781350482315, "acc_norm_stderr": 0.028043399858210628 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5709876543209876, "acc_stderr": 0.027538925613470863, "acc_norm": 0.5709876543209876, "acc_norm_stderr": 0.027538925613470863 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3546099290780142, "acc_stderr": 0.028538650028878634, "acc_norm": 0.3546099290780142, "acc_norm_stderr": 0.028538650028878634 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3539765319426336, "acc_stderr": 0.01221350473173164, "acc_norm": 0.3539765319426336, "acc_norm_stderr": 0.01221350473173164 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4485294117647059, "acc_stderr": 0.030211479609121596, "acc_norm": 0.4485294117647059, "acc_norm_stderr": 0.030211479609121596 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5196078431372549, "acc_stderr": 0.020212274976302954, "acc_norm": 0.5196078431372549, "acc_norm_stderr": 0.020212274976302954 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6090909090909091, "acc_stderr": 
0.046737523336702384, "acc_norm": 0.6090909090909091, "acc_norm_stderr": 0.046737523336702384 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5387755102040817, "acc_stderr": 0.031912820526692774, "acc_norm": 0.5387755102040817, "acc_norm_stderr": 0.031912820526692774 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6965174129353234, "acc_stderr": 0.03251006816458618, "acc_norm": 0.6965174129353234, "acc_norm_stderr": 0.03251006816458618 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.76, "acc_stderr": 0.04292346959909283, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-virology|5": { "acc": 0.4819277108433735, "acc_stderr": 0.03889951252827217, "acc_norm": 0.4819277108433735, "acc_norm_stderr": 0.03889951252827217 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6666666666666666, "acc_stderr": 0.036155076303109365, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.036155076303109365 }, "harness|truthfulqa:mc|0": { "mc1": 0.32068543451652387, "mc1_stderr": 0.0163391703732809, "mc2": 0.48261245911723205, "mc2_stderr": 0.015170591571160566 }, "harness|winogrande|5": { "acc": 0.7032359905288083, "acc_stderr": 0.012839239695202032 }, "harness|gsm8k|5": { "acc": 0.2934040940106141, "acc_stderr": 0.012541830815461487 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
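As the card notes, each run is stored as a split named after its timestamp, with `latest` aliasing the newest run. If you ever need to resolve that alias yourself (for example, when inspecting split names offline), a minimal sketch; the helper `newest_split` is illustrative, and the example run names are taken from similar leaderboard detail datasets:

```python
from datetime import datetime

# Run splits in these leaderboard detail datasets encode the run
# timestamp, e.g. "2024_03_16T15_55_01.323956"; the "latest" split
# aliases the newest run. newest_split is an illustrative helper.
def newest_split(split_names):
    fmt = "%Y_%m_%dT%H_%M_%S.%f"
    return max(split_names, key=lambda name: datetime.strptime(name, fmt))

runs = ["2023_08_18T22_24_06.867434", "2023_08_18T22_33_04.843641"]
print(newest_split(runs))  # prints 2023_08_18T22_33_04.843641
```

In practice the `datasets` library makes this unnecessary: passing `split="latest"` to `load_dataset`, as in the snippet above, already selects the newest run.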
CyberHarem/sense_sousounofrieren
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of Sense/ゼンゼ (Sousou no Frieren) This is the dataset of Sense/ゼンゼ (Sousou no Frieren), containing 85 images and their tags. The core tags of this character are `long_hair, brown_hair, bow, white_bow, hair_between_eyes, very_long_hair, brown_eyes, ahoge`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 85 | 49.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sense_sousounofrieren/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 1200 | 85 | 49.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sense_sousounofrieren/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 127 | 77.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sense_sousounofrieren/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. 
If you need this, just run the following code: ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/sense_sousounofrieren', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering results; some outfits may be mined here. ### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------| | 0 | 27 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, expressionless, solo, closed_mouth, looking_at_viewer, upper_body, white_bowtie, capelet | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | expressionless | solo | closed_mouth | looking_at_viewer | upper_body | white_bowtie | capelet | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------|:---------------|:--------------------|:-------------|:---------------|:----------| | 0 | 27 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | 
![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X |
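The core-tag pruning mentioned at the top of this card (character-defining tags such as `long_hair` are removed from each image's tag list, leaving scene-specific tags like those in the cluster above) can be sketched as follows; `prune_core_tags` is an illustrative helper, not part of waifuc:

```python
# Core character tags listed in this card; per the card, they are
# pruned from each image's tag list. prune_core_tags is an
# illustrative helper, not part of any library.
CORE_TAGS = {
    "long_hair", "brown_hair", "bow", "white_bow",
    "hair_between_eyes", "very_long_hair", "brown_eyes", "ahoge",
}

def prune_core_tags(tags):
    """Keep only scene-specific tags, dropping character-defining ones."""
    return [t for t in tags if t not in CORE_TAGS]

print(prune_core_tags(["1girl", "long_hair", "solo", "brown_eyes"]))
# prints ['1girl', 'solo']
```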
open-llm-leaderboard/details_chargoddard__ypotryll-22b-epoch2-qlora
--- pretty_name: Evaluation run of chargoddard/ypotryll-22b-epoch2-qlora dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [chargoddard/ypotryll-22b-epoch2-qlora](https://huggingface.co/chargoddard/ypotryll-22b-epoch2-qlora)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__ypotryll-22b-epoch2-qlora\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-09-26T17:07:11.654928](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__ypotryll-22b-epoch2-qlora/blob/main/results_2023-09-26T17-07-11.654928.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.39198825503355705,\n\ \ \"em_stderr\": 0.004999564353850857,\n \"f1\": 0.452352139261747,\n\ \ \"f1_stderr\": 0.004826380442768646,\n \"acc\": 0.4085244316417271,\n\ \ \"acc_stderr\": 0.00908196050272276\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.39198825503355705,\n \"em_stderr\": 0.004999564353850857,\n\ \ \"f1\": 0.452352139261747,\n \"f1_stderr\": 0.004826380442768646\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.053828658074298714,\n \ \ \"acc_stderr\": 0.006216328640238116\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7632202052091555,\n \"acc_stderr\": 0.0119475923652074\n\ \ }\n}\n```" repo_url: https://huggingface.co/chargoddard/ypotryll-22b-epoch2-qlora leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|arc:challenge|25_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|arc:challenge|25_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-08-18T22:33:04.843641.parquet' - config_name: harness_drop_3 data_files: - split: 2023_09_26T17_07_11.654928 path: - '**/details_harness|drop|3_2023-09-26T17-07-11.654928.parquet' - split: latest path: - '**/details_harness|drop|3_2023-09-26T17-07-11.654928.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_09_26T17_07_11.654928 path: - '**/details_harness|gsm8k|5_2023-09-26T17-07-11.654928.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-09-26T17-07-11.654928.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hellaswag|10_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - 
'**/details_harness|hellaswag|10_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T22:24:06.867434.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-18T22:24:06.867434.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-18T22:24:06.867434.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T22:33:04.843641.parquet' - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T22:33:04.843641.parquet' - 
'**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T22:33:04.843641.parquet' - 
'**/details_harness|hendrycksTest-prehistory|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T22:33:04.843641.parquet' - 
'**/details_harness|hendrycksTest-college_physics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T22:33:04.843641.parquet' - 
'**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T22:33:04.843641.parquet' - 
'**/details_harness|hendrycksTest-security_studies|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-18T22:33:04.843641.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 
2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - 
'**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T22:33:04.843641.parquet' - 
config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T22:24:06.867434.parquet' - split: 2023_08_18T22_33_04.843641 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T22:33:04.843641.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T22:33:04.843641.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_08_18T22_24_06.867434 path: - 
    '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_international_law_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_management_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-management|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-management|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-management|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_marketing_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_nutrition_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_philosophy_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_prehistory_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_professional_law_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_public_relations_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_security_studies_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_sociology_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-sociology|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-sociology|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-sociology|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_virology_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-virology|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-virology|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-virology|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_world_religions_5
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_truthfulqa_mc_0
  data_files:
  - split: 2023_08_18T22_24_06.867434
    path:
    - '**/details_harness|truthfulqa:mc|0_2023-08-18T22:24:06.867434.parquet'
  - split: 2023_08_18T22_33_04.843641
    path:
    - '**/details_harness|truthfulqa:mc|0_2023-08-18T22:33:04.843641.parquet'
  - split: latest
    path:
    - '**/details_harness|truthfulqa:mc|0_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_winogrande_5
  data_files:
  - split: 2023_09_26T17_07_11.654928
    path:
    - '**/details_harness|winogrande|5_2023-09-26T17-07-11.654928.parquet'
  - split: latest
    path:
    - '**/details_harness|winogrande|5_2023-09-26T17-07-11.654928.parquet'
- config_name: results
  data_files:
  - split: 2023_09_26T17_07_11.654928
    path:
    - results_2023-09-26T17-07-11.654928.parquet
  - split: latest
    path:
    - results_2023-09-26T17-07-11.654928.parquet
---

# Dataset Card for Evaluation run of chargoddard/ypotryll-22b-epoch2-qlora

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/ypotryll-22b-epoch2-qlora
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [chargoddard/ypotryll-22b-epoch2-qlora](https://huggingface.co/chargoddard/ypotryll-22b-epoch2-qlora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__ypotryll-22b-epoch2-qlora",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-09-26T17:07:11.654928](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__ypotryll-22b-epoch2-qlora/blob/main/results_2023-09-26T17-07-11.654928.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.39198825503355705,
        "em_stderr": 0.004999564353850857,
        "f1": 0.452352139261747,
        "f1_stderr": 0.004826380442768646,
        "acc": 0.4085244316417271,
        "acc_stderr": 0.00908196050272276
    },
    "harness|drop|3": {
        "em": 0.39198825503355705,
        "em_stderr": 0.004999564353850857,
        "f1": 0.452352139261747,
        "f1_stderr": 0.004826380442768646
    },
    "harness|gsm8k|5": {
        "acc": 0.053828658074298714,
        "acc_stderr": 0.006216328640238116
    },
    "harness|winogrande|5": {
        "acc": 0.7632202052091555,
        "acc_stderr": 0.0119475923652074
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
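A note on naming: the split names in the configurations above encode run timestamps with underscores (e.g. `2023_08_18T22_33_04.843641`), while the corresponding parquet filenames use hyphens in the date part and either colons (older per-task files) or hyphens (the latest aggregated results) in the time part. A minimal helper sketching this mapping, written against the patterns visible in this card (the function name is my own, not part of any API):

```python
def split_to_timestamp(split_name: str, time_sep: str = ":") -> str:
    """Convert a split name like '2023_08_18T22_33_04.843641' into the
    timestamp embedded in the parquet filenames.

    Older per-task files use ':' in the time component
    (e.g. '...|5_2023-08-18T22:33:04.843641.parquet'), while the latest
    aggregated results file uses '-'
    (e.g. 'results_2023-09-26T17-07-11.654928.parquet'), so the time
    separator is configurable.
    """
    # Split the name into its date and time halves at the 'T' marker.
    date_part, time_part = split_name.split("T", 1)
    # Dates always use '-'; the time separator depends on the file family.
    return date_part.replace("_", "-") + "T" + time_part.replace("_", time_sep)

# Per-task detail file timestamp:
# split_to_timestamp("2023_08_18T22_33_04.843641") -> "2023-08-18T22:33:04.843641"
# Aggregated results file timestamp:
# split_to_timestamp("2023_09_26T17_07_11.654928", time_sep="-") -> "2023-09-26T17-07-11.654928"
```

This can be handy when resolving a split of interest to the concrete file it points at inside the repository.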