Columns: `datasetId` (string, length 2–117), `card` (string, length 19–1.01M)
higgsfield/dsml_original_loc
--- dataset_info: features: - name: prompt dtype: string - name: completion dtype: string splits: - name: train num_bytes: 243684731 num_examples: 32477 download_size: 27760890 dataset_size: 243684731 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "dsml_original_loc" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
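As a quick sanity check on the metadata above, the per-example size and on-disk compression ratio can be derived directly from the declared byte counts (a minimal sketch; the numbers are copied verbatim from the `dataset_info` block):

```python
# Figures copied from the dataset_info metadata above.
num_bytes = 243_684_731      # uncompressed dataset_size
num_examples = 32_477
download_size = 27_760_890   # compressed Parquet download

avg_example_bytes = num_bytes / num_examples
compression_ratio = download_size / num_bytes

print(f"{avg_example_bytes:.0f} bytes/example")    # roughly 7.5 kB per prompt/completion pair
print(f"{compression_ratio:.1%} of original size")
```

Checks like this catch stale metadata: if `num_bytes` and `num_examples` drift apart after a re-upload, the average example size usually makes the mismatch obvious.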
mekaneeky/salt-llama-lgg-to-eng
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: dev path: data/dev-* - split: test path: data/test-* dataset_info: features: - name: ID dtype: string - name: text dtype: string splits: - name: train num_bytes: 4130369 num_examples: 23947 - name: dev num_bytes: 85575 num_examples: 500 - name: test num_bytes: 87440 num_examples: 500 download_size: 2324474 dataset_size: 4303384 --- # Dataset Card for "salt-llama-lgg-to-eng" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
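The three splits declared above should account for the full `dataset_size`; that invariant can be verified with a few lines (a minimal sketch using the split byte counts copied from the `dataset_info` block):

```python
# Split sizes copied from the dataset_info metadata above.
splits = {"train": 4_130_369, "dev": 85_575, "test": 87_440}

total = sum(splits.values())
assert total == 4_303_384  # matches the declared dataset_size

# dev and test are fixed 500-example evaluation sets; train carries the rest.
print(f"train share: {splits['train'] / total:.1%}")
```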
Nan-Do/instructional_code-search-net-go
--- dataset_info: features: - name: INSTRUCTION dtype: string - name: RESPONSE dtype: string - name: SOURCE dtype: string splits: - name: train num_bytes: 122612124 num_examples: 203128 download_size: 45476654 dataset_size: 122612124 --- # Dataset Card for "instructional_code-search-net-go" This dataset still requires more work. Please don't use it yet.
AdapterOcean/data-standardized_cluster_22_alpaca
--- dataset_info: features: - name: input dtype: string - name: output dtype: string splits: - name: train num_bytes: 26759579 num_examples: 12736 download_size: 11363431 dataset_size: 26759579 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "data-standardized_cluster_22_alpaca" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_deepseek-ai__deepseek-math-7b-instruct
--- pretty_name: Evaluation run of deepseek-ai/deepseek-math-7b-instruct dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [deepseek-ai/deepseek-math-7b-instruct](https://huggingface.co/deepseek-ai/deepseek-math-7b-instruct)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_deepseek-ai__deepseek-math-7b-instruct\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-13T18:13:18.094811](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-math-7b-instruct/blob/main/results_2024-03-13T18-13-18.094811.json)(note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.554019303836701,\n\ \ \"acc_stderr\": 0.034497761386808885,\n \"acc_norm\": 0.5618809813979222,\n\ \ \"acc_norm_stderr\": 0.035247964085412954,\n \"mc1\": 0.29253365973072215,\n\ \ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.40156731347428204,\n\ \ \"mc2_stderr\": 0.014934119039002425\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5008532423208191,\n \"acc_stderr\": 0.014611369529813269,\n\ \ \"acc_norm\": 0.5341296928327645,\n \"acc_norm_stderr\": 0.014577311315231108\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5409281019717188,\n\ \ \"acc_stderr\": 0.004973036453863722,\n \"acc_norm\": 0.7149970125473013,\n\ \ \"acc_norm_stderr\": 0.004504932999736403\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\ \ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\ \ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n\ \ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\ \ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \ \ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5358490566037736,\n \"acc_stderr\": 0.030693675018458003,\n\ \ \"acc_norm\": 0.5358490566037736,\n \"acc_norm_stderr\": 0.030693675018458003\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\ \ \"acc_stderr\": 0.04101405519842425,\n \"acc_norm\": 0.5972222222222222,\n\ \ \"acc_norm_stderr\": 0.04101405519842425\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \ \ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\ : 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \ \ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\ \ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\ \ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\ \ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\ \ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.6425531914893617,\n \"acc_stderr\": 0.031329417894764254,\n\ \ \"acc_norm\": 0.6425531914893617,\n \"acc_norm_stderr\": 0.031329417894764254\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\ \ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\ \ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.03996629574876718,\n\ \ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.03996629574876718\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.5661375661375662,\n \"acc_stderr\": 0.025525034382474894,\n \"\ acc_norm\": 0.5661375661375662,\n 
\"acc_norm_stderr\": 0.025525034382474894\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\ \ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \ \ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6451612903225806,\n\ \ \"acc_stderr\": 0.027218889773308757,\n \"acc_norm\": 0.6451612903225806,\n\ \ \"acc_norm_stderr\": 0.027218889773308757\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\ \ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\ : 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.036462049632538115,\n\ \ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.036462049632538115\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\ acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.6787564766839378,\n \"acc_stderr\": 0.033699508685490674,\n\ \ \"acc_norm\": 0.6787564766839378,\n \"acc_norm_stderr\": 0.033699508685490674\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.025069094387296535,\n\ \ \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.025069094387296535\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ 
acc\": 0.40370370370370373,\n \"acc_stderr\": 0.029914812342227624,\n \ \ \"acc_norm\": 0.40370370370370373,\n \"acc_norm_stderr\": 0.029914812342227624\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977927,\n\ \ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977927\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849927,\n \"\ acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849927\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7376146788990826,\n \"acc_stderr\": 0.018861885021534727,\n \"\ acc_norm\": 0.7376146788990826,\n \"acc_norm_stderr\": 0.018861885021534727\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5787037037037037,\n \"acc_stderr\": 0.03367462138896078,\n \"\ acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.03367462138896078\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.5490196078431373,\n \"acc_stderr\": 0.03492406104163613,\n \"\ acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.03492406104163613\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.6708860759493671,\n \"acc_stderr\": 0.030587326294702354,\n \ \ \"acc_norm\": 0.6708860759493671,\n \"acc_norm_stderr\": 0.030587326294702354\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5426008968609866,\n\ \ \"acc_stderr\": 0.03343577705583065,\n \"acc_norm\": 0.5426008968609866,\n\ \ \"acc_norm_stderr\": 0.03343577705583065\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\ \ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.6611570247933884,\n \"acc_stderr\": 
0.043207678075366705,\n \"\ acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.043207678075366705\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\ \ \"acc_stderr\": 0.04643454608906276,\n \"acc_norm\": 0.6388888888888888,\n\ \ \"acc_norm_stderr\": 0.04643454608906276\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\ \ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\ \ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\ \ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.0458212416016155,\n\ \ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.0458212416016155\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8034188034188035,\n\ \ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.8034188034188035,\n\ \ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \ \ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6730523627075351,\n\ \ \"acc_stderr\": 0.016774908180131463,\n \"acc_norm\": 0.6730523627075351,\n\ \ \"acc_norm_stderr\": 0.016774908180131463\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.02668013476167922,\n\ \ \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.02668013476167922\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29497206703910617,\n\ \ \"acc_stderr\": 0.015251931579208199,\n \"acc_norm\": 0.29497206703910617,\n\ \ \"acc_norm_stderr\": 0.015251931579208199\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.028580341065138296,\n\ \ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.028580341065138296\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5819935691318328,\n\ \ \"acc_stderr\": 0.028013651891995076,\n \"acc_norm\": 0.5819935691318328,\n\ \ \"acc_norm_stderr\": 0.028013651891995076\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.027801656212323667,\n\ \ \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.027801656212323667\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.37943262411347517,\n \"acc_stderr\": 0.0289473388516141,\n \ \ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.0289473388516141\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38461538461538464,\n\ \ \"acc_stderr\": 0.01242554841630294,\n \"acc_norm\": 0.38461538461538464,\n\ \ \"acc_norm_stderr\": 0.01242554841630294\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.41544117647058826,\n \"acc_stderr\": 0.029935342707877746,\n\ \ \"acc_norm\": 0.41544117647058826,\n \"acc_norm_stderr\": 0.029935342707877746\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5081699346405228,\n \"acc_stderr\": 0.02022513434305727,\n \ \ \"acc_norm\": 0.5081699346405228,\n \"acc_norm_stderr\": 0.02022513434305727\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\ \ \"acc_stderr\": 0.04724577405731571,\n \"acc_norm\": 0.5818181818181818,\n\ \ \"acc_norm_stderr\": 0.04724577405731571\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.03063565515038764,\n\ \ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.03063565515038764\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\ \ 
\"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\ \ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\ \ \"acc_stderr\": 0.03828401115079021,\n \"acc_norm\": 0.40963855421686746,\n\ \ \"acc_norm_stderr\": 0.03828401115079021\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.03786720706234214,\n\ \ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.03786720706234214\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29253365973072215,\n\ \ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.40156731347428204,\n\ \ \"mc2_stderr\": 0.014934119039002425\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.6574585635359116,\n \"acc_stderr\": 0.013337483579075923\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19408642911296436,\n \ \ \"acc_stderr\": 0.010893918308192413\n }\n}\n```" repo_url: https://huggingface.co/deepseek-ai/deepseek-math-7b-instruct leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|arc:challenge|25_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|arc:challenge|25_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-13T18-13-18.094811.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|gsm8k|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|gsm8k|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - 
'**/details_harness|gsm8k|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hellaswag|10_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hellaswag|10_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T07-42-48.023389.parquet' - 
'**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T07-42-48.023389.parquet' - 
'**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-12T07-42-48.023389.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - 
'**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T18-13-18.094811.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T18-13-18.094811.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T18-13-18.094811.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T18-13-18.094811.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T18-13-18.094811.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-13T18-13-18.094811.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T18-13-18.094811.parquet' 
- config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - 
'**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - 
'**/details_harness|hendrycksTest-computer_security|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_12T07_42_48.023389 
path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T07-42-48.023389.parquet' 
- split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - 
'**/details_harness|hendrycksTest-human_aging|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T18-13-18.094811.parquet' - config_name: 
harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-management|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-management|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - 
'**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - 
'**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T18-13-18.094811.parquet' - config_name: 
harness_hendrycksTest_virology_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T18-13-18.094811.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|truthfulqa:mc|0_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|truthfulqa:mc|0_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-13T18-13-18.094811.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_12T07_42_48.023389 path: - '**/details_harness|winogrande|5_2024-03-12T07-42-48.023389.parquet' - split: 2024_03_13T18_13_18.094811 path: - '**/details_harness|winogrande|5_2024-03-13T18-13-18.094811.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-13T18-13-18.094811.parquet' - config_name: results data_files: - split: 2024_03_12T07_42_48.023389 path: - results_2024-03-12T07-42-48.023389.parquet - split: 2024_03_13T18_13_18.094811 path: - results_2024-03-13T18-13-18.094811.parquet - split: latest path: - results_2024-03-13T18-13-18.094811.parquet --- # Dataset Card for Evaluation run of deepseek-ai/deepseek-math-7b-instruct <!-- Provide a quick summary of the dataset. 
-->

Dataset automatically created during the evaluation run of model [deepseek-ai/deepseek-math-7b-instruct](https://huggingface.co/deepseek-ai/deepseek-math-7b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_deepseek-ai__deepseek-math-7b-instruct",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-03-13T18:13:18.094811](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-math-7b-instruct/blob/main/results_2024-03-13T18-13-18.094811.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.554019303836701, "acc_stderr": 0.034497761386808885, "acc_norm": 0.5618809813979222, "acc_norm_stderr": 0.035247964085412954, "mc1": 0.29253365973072215, "mc1_stderr": 0.015925597445286165, "mc2": 0.40156731347428204, "mc2_stderr": 0.014934119039002425 }, "harness|arc:challenge|25": { "acc": 0.5008532423208191, "acc_stderr": 0.014611369529813269, "acc_norm": 0.5341296928327645, "acc_norm_stderr": 0.014577311315231108 }, "harness|hellaswag|10": { "acc": 0.5409281019717188, "acc_stderr": 0.004973036453863722, "acc_norm": 0.7149970125473013, "acc_norm_stderr": 0.004504932999736403 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.04793724854411021, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411021 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4222222222222222, "acc_stderr": 0.04266763404099582, "acc_norm": 0.4222222222222222, "acc_norm_stderr": 0.04266763404099582 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6381578947368421, "acc_stderr": 0.03910525752849724, "acc_norm": 0.6381578947368421, "acc_norm_stderr": 0.03910525752849724 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5358490566037736, "acc_stderr": 0.030693675018458003, "acc_norm": 0.5358490566037736, "acc_norm_stderr": 0.030693675018458003 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5972222222222222, "acc_stderr": 0.04101405519842425, "acc_norm": 0.5972222222222222, "acc_norm_stderr": 0.04101405519842425 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, 
        "acc_norm_stderr": 0.04988876515698589
    },
    "harness|hendrycksTest-college_mathematics|5": {
        "acc": 0.43,
        "acc_stderr": 0.049756985195624284,
        "acc_norm": 0.43,
        "acc_norm_stderr": 0.049756985195624284
    },
    "harness|hendrycksTest-college_medicine|5": {
        "acc": 0.5375722543352601,
        "acc_stderr": 0.0380168510452446,
        "acc_norm": 0.5375722543352601,
        "acc_norm_stderr": 0.0380168510452446
    },
    "harness|hendrycksTest-college_physics|5": {
        "acc": 0.38235294117647056,
        "acc_stderr": 0.04835503696107223,
        "acc_norm": 0.38235294117647056,
        "acc_norm_stderr": 0.04835503696107223
    },
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.68,
        "acc_stderr": 0.04688261722621505,
        "acc_norm": 0.68,
        "acc_norm_stderr": 0.04688261722621505
    },
    "harness|hendrycksTest-conceptual_physics|5": {
        "acc": 0.6425531914893617,
        "acc_stderr": 0.031329417894764254,
        "acc_norm": 0.6425531914893617,
        "acc_norm_stderr": 0.031329417894764254
    },
    "harness|hendrycksTest-econometrics|5": {
        "acc": 0.47368421052631576,
        "acc_stderr": 0.046970851366478626,
        "acc_norm": 0.47368421052631576,
        "acc_norm_stderr": 0.046970851366478626
    },
    "harness|hendrycksTest-electrical_engineering|5": {
        "acc": 0.6413793103448275,
        "acc_stderr": 0.03996629574876718,
        "acc_norm": 0.6413793103448275,
        "acc_norm_stderr": 0.03996629574876718
    },
    "harness|hendrycksTest-elementary_mathematics|5": {
        "acc": 0.5661375661375662,
        "acc_stderr": 0.025525034382474894,
        "acc_norm": 0.5661375661375662,
        "acc_norm_stderr": 0.025525034382474894
    },
    "harness|hendrycksTest-formal_logic|5": {
        "acc": 0.5,
        "acc_stderr": 0.04472135954999579,
        "acc_norm": 0.5,
        "acc_norm_stderr": 0.04472135954999579
    },
    "harness|hendrycksTest-global_facts|5": {
        "acc": 0.28,
        "acc_stderr": 0.04512608598542126,
        "acc_norm": 0.28,
        "acc_norm_stderr": 0.04512608598542126
    },
    "harness|hendrycksTest-high_school_biology|5": {
        "acc": 0.6451612903225806,
        "acc_stderr": 0.027218889773308757,
        "acc_norm": 0.6451612903225806,
        "acc_norm_stderr": 0.027218889773308757
    },
    "harness|hendrycksTest-high_school_chemistry|5": {
        "acc": 0.541871921182266,
        "acc_stderr": 0.03505630140785741,
        "acc_norm": 0.541871921182266,
        "acc_norm_stderr": 0.03505630140785741
    },
    "harness|hendrycksTest-high_school_computer_science|5": {
        "acc": 0.7,
        "acc_stderr": 0.046056618647183814,
        "acc_norm": 0.7,
        "acc_norm_stderr": 0.046056618647183814
    },
    "harness|hendrycksTest-high_school_european_history|5": {
        "acc": 0.6787878787878788,
        "acc_stderr": 0.036462049632538115,
        "acc_norm": 0.6787878787878788,
        "acc_norm_stderr": 0.036462049632538115
    },
    "harness|hendrycksTest-high_school_geography|5": {
        "acc": 0.6868686868686869,
        "acc_stderr": 0.033042050878136525,
        "acc_norm": 0.6868686868686869,
        "acc_norm_stderr": 0.033042050878136525
    },
    "harness|hendrycksTest-high_school_government_and_politics|5": {
        "acc": 0.6787564766839378,
        "acc_stderr": 0.033699508685490674,
        "acc_norm": 0.6787564766839378,
        "acc_norm_stderr": 0.033699508685490674
    },
    "harness|hendrycksTest-high_school_macroeconomics|5": {
        "acc": 0.5743589743589743,
        "acc_stderr": 0.025069094387296535,
        "acc_norm": 0.5743589743589743,
        "acc_norm_stderr": 0.025069094387296535
    },
    "harness|hendrycksTest-high_school_mathematics|5": {
        "acc": 0.40370370370370373,
        "acc_stderr": 0.029914812342227624,
        "acc_norm": 0.40370370370370373,
        "acc_norm_stderr": 0.029914812342227624
    },
    "harness|hendrycksTest-high_school_microeconomics|5": {
        "acc": 0.6890756302521008,
        "acc_stderr": 0.030066761582977927,
        "acc_norm": 0.6890756302521008,
        "acc_norm_stderr": 0.030066761582977927
    },
    "harness|hendrycksTest-high_school_physics|5": {
        "acc": 0.4105960264900662,
        "acc_stderr": 0.04016689594849927,
        "acc_norm": 0.4105960264900662,
        "acc_norm_stderr": 0.04016689594849927
    },
    "harness|hendrycksTest-high_school_psychology|5": {
        "acc": 0.7376146788990826,
        "acc_stderr": 0.018861885021534727,
        "acc_norm": 0.7376146788990826,
        "acc_norm_stderr": 0.018861885021534727
    },
    "harness|hendrycksTest-high_school_statistics|5": {
        "acc": 0.5787037037037037,
        "acc_stderr": 0.03367462138896078,
        "acc_norm": 0.5787037037037037,
        "acc_norm_stderr": 0.03367462138896078
    },
    "harness|hendrycksTest-high_school_us_history|5": {
        "acc": 0.5490196078431373,
        "acc_stderr": 0.03492406104163613,
        "acc_norm": 0.5490196078431373,
        "acc_norm_stderr": 0.03492406104163613
    },
    "harness|hendrycksTest-high_school_world_history|5": {
        "acc": 0.6708860759493671,
        "acc_stderr": 0.030587326294702354,
        "acc_norm": 0.6708860759493671,
        "acc_norm_stderr": 0.030587326294702354
    },
    "harness|hendrycksTest-human_aging|5": {
        "acc": 0.5426008968609866,
        "acc_stderr": 0.03343577705583065,
        "acc_norm": 0.5426008968609866,
        "acc_norm_stderr": 0.03343577705583065
    },
    "harness|hendrycksTest-human_sexuality|5": {
        "acc": 0.5954198473282443,
        "acc_stderr": 0.043046937953806645,
        "acc_norm": 0.5954198473282443,
        "acc_norm_stderr": 0.043046937953806645
    },
    "harness|hendrycksTest-international_law|5": {
        "acc": 0.6611570247933884,
        "acc_stderr": 0.043207678075366705,
        "acc_norm": 0.6611570247933884,
        "acc_norm_stderr": 0.043207678075366705
    },
    "harness|hendrycksTest-jurisprudence|5": {
        "acc": 0.6388888888888888,
        "acc_stderr": 0.04643454608906276,
        "acc_norm": 0.6388888888888888,
        "acc_norm_stderr": 0.04643454608906276
    },
    "harness|hendrycksTest-logical_fallacies|5": {
        "acc": 0.6687116564417178,
        "acc_stderr": 0.03697983910025588,
        "acc_norm": 0.6687116564417178,
        "acc_norm_stderr": 0.03697983910025588
    },
    "harness|hendrycksTest-machine_learning|5": {
        "acc": 0.49107142857142855,
        "acc_stderr": 0.04745033255489123,
        "acc_norm": 0.49107142857142855,
        "acc_norm_stderr": 0.04745033255489123
    },
    "harness|hendrycksTest-management|5": {
        "acc": 0.6893203883495146,
        "acc_stderr": 0.0458212416016155,
        "acc_norm": 0.6893203883495146,
        "acc_norm_stderr": 0.0458212416016155
    },
    "harness|hendrycksTest-marketing|5": {
        "acc": 0.8034188034188035,
        "acc_stderr": 0.02603538609895129,
        "acc_norm": 0.8034188034188035,
        "acc_norm_stderr": 0.02603538609895129
    },
    "harness|hendrycksTest-medical_genetics|5": {
        "acc": 0.62,
        "acc_stderr": 0.048783173121456316,
        "acc_norm": 0.62,
        "acc_norm_stderr": 0.048783173121456316
    },
    "harness|hendrycksTest-miscellaneous|5": {
        "acc": 0.6730523627075351,
        "acc_stderr": 0.016774908180131463,
        "acc_norm": 0.6730523627075351,
        "acc_norm_stderr": 0.016774908180131463
    },
    "harness|hendrycksTest-moral_disputes|5": {
        "acc": 0.5664739884393064,
        "acc_stderr": 0.02668013476167922,
        "acc_norm": 0.5664739884393064,
        "acc_norm_stderr": 0.02668013476167922
    },
    "harness|hendrycksTest-moral_scenarios|5": {
        "acc": 0.29497206703910617,
        "acc_stderr": 0.015251931579208199,
        "acc_norm": 0.29497206703910617,
        "acc_norm_stderr": 0.015251931579208199
    },
    "harness|hendrycksTest-nutrition|5": {
        "acc": 0.5294117647058824,
        "acc_stderr": 0.028580341065138296,
        "acc_norm": 0.5294117647058824,
        "acc_norm_stderr": 0.028580341065138296
    },
    "harness|hendrycksTest-philosophy|5": {
        "acc": 0.5819935691318328,
        "acc_stderr": 0.028013651891995076,
        "acc_norm": 0.5819935691318328,
        "acc_norm_stderr": 0.028013651891995076
    },
    "harness|hendrycksTest-prehistory|5": {
        "acc": 0.5185185185185185,
        "acc_stderr": 0.027801656212323667,
        "acc_norm": 0.5185185185185185,
        "acc_norm_stderr": 0.027801656212323667
    },
    "harness|hendrycksTest-professional_accounting|5": {
        "acc": 0.37943262411347517,
        "acc_stderr": 0.0289473388516141,
        "acc_norm": 0.37943262411347517,
        "acc_norm_stderr": 0.0289473388516141
    },
    "harness|hendrycksTest-professional_law|5": {
        "acc": 0.38461538461538464,
        "acc_stderr": 0.01242554841630294,
        "acc_norm": 0.38461538461538464,
        "acc_norm_stderr": 0.01242554841630294
    },
    "harness|hendrycksTest-professional_medicine|5": {
        "acc": 0.41544117647058826,
        "acc_stderr": 0.029935342707877746,
        "acc_norm": 0.41544117647058826,
        "acc_norm_stderr": 0.029935342707877746
    },
    "harness|hendrycksTest-professional_psychology|5": {
        "acc": 0.5081699346405228,
        "acc_stderr": 0.02022513434305727,
        "acc_norm": 0.5081699346405228,
        "acc_norm_stderr": 0.02022513434305727
    },
    "harness|hendrycksTest-public_relations|5": {
        "acc": 0.5818181818181818,
        "acc_stderr": 0.04724577405731571,
        "acc_norm": 0.5818181818181818,
        "acc_norm_stderr": 0.04724577405731571
    },
    "harness|hendrycksTest-security_studies|5": {
        "acc": 0.6448979591836734,
        "acc_stderr": 0.03063565515038764,
        "acc_norm": 0.6448979591836734,
        "acc_norm_stderr": 0.03063565515038764
    },
    "harness|hendrycksTest-sociology|5": {
        "acc": 0.7313432835820896,
        "acc_stderr": 0.03134328358208954,
        "acc_norm": 0.7313432835820896,
        "acc_norm_stderr": 0.03134328358208954
    },
    "harness|hendrycksTest-us_foreign_policy|5": {
        "acc": 0.69,
        "acc_stderr": 0.04648231987117316,
        "acc_norm": 0.69,
        "acc_norm_stderr": 0.04648231987117316
    },
    "harness|hendrycksTest-virology|5": {
        "acc": 0.40963855421686746,
        "acc_stderr": 0.03828401115079021,
        "acc_norm": 0.40963855421686746,
        "acc_norm_stderr": 0.03828401115079021
    },
    "harness|hendrycksTest-world_religions|5": {
        "acc": 0.5789473684210527,
        "acc_stderr": 0.03786720706234214,
        "acc_norm": 0.5789473684210527,
        "acc_norm_stderr": 0.03786720706234214
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.29253365973072215,
        "mc1_stderr": 0.015925597445286165,
        "mc2": 0.40156731347428204,
        "mc2_stderr": 0.014934119039002425
    },
    "harness|winogrande|5": {
        "acc": 0.6574585635359116,
        "acc_stderr": 0.013337483579075923
    },
    "harness|gsm8k|5": {
        "acc": 0.19408642911296436,
        "acc_stderr": 0.010893918308192413
    }
}
```

## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used. -->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
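Each `harness|…` block in the results JSON above shares the same shape, so summary statistics can be computed directly from it. The snippet below is a minimal sketch that macro-averages the `acc` metric over the MMLU (`hendrycksTest`) subtasks; note the `results` dict here is a small hand-copied subset of the values above, not the full results file.

```python
# Hand-copied subset of the per-task results shown above (not the full file).
results = {
    "harness|hendrycksTest-college_mathematics|5": {"acc": 0.43},
    "harness|hendrycksTest-college_medicine|5": {"acc": 0.5375722543352601},
    "harness|hendrycksTest-college_physics|5": {"acc": 0.38235294117647056},
}

# Macro-average the "acc" metric over the MMLU (hendrycksTest) subtasks.
mmlu_accs = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"macro-average acc over {len(mmlu_accs)} tasks: {macro_avg:.4f}")
```

The same loop works on the complete JSON block once it is loaded with `json.loads`.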
arazd/tulu_dolly
---
license: openrail
---
paduraru2009/imdb-sample2
---
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: validation
    path: data/validation-*
dataset_info:
  features:
  - name: label
    dtype: int64
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 40107731
    num_examples: 30000
  - name: validation
    num_bytes: 39127084
    num_examples: 30000
  download_size: 50593468
  dataset_size: 79234815
---
# Dataset Card for "imdb-sample2"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-elementary_mathematics
---
dataset_info:
  features:
  - name: question
    dtype: string
  - name: choices
    sequence: string
  - name: answer
    dtype:
      class_label:
        names:
          '0': A
          '1': B
          '2': C
          '3': D
  - name: negate_openai_prompt
    struct:
    - name: content
      dtype: string
    - name: role
      dtype: string
  - name: neg_question
    dtype: string
  - name: fewshot_context
    dtype: string
  - name: fewshot_context_neg
    dtype: string
  splits:
  - name: dev
    num_bytes: 4869
    num_examples: 5
  - name: test
    num_bytes: 1253265
    num_examples: 378
  download_size: 136289
  dataset_size: 1258134
configs:
- config_name: default
  data_files:
  - split: dev
    path: data/dev-*
  - split: test
    path: data/test-*
---
# Dataset Card for "mmlu-elementary_mathematics"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
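Because `answer` is declared as a `class_label` with names `A` through `D`, each row stores an integer index rather than the letter itself. A minimal decoding sketch (the list below is copied from the card's class_label names; when loading with the `datasets` library, `features["answer"].int2str` does the same job):

```python
# class_label names copied from the card above: index -> choice letter.
ANSWER_NAMES = ["A", "B", "C", "D"]

def decode_answer(idx: int) -> str:
    """Map the stored integer `answer` back to its choice letter."""
    return ANSWER_NAMES[idx]

print(decode_answer(2))  # C
```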
CyberHarem/rapi_nikke
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of rapi/ラピ/拉毗/라피 (Nikke: Goddess of Victory)

This is the dataset of rapi/ラピ/拉毗/라피 (Nikke: Goddess of Victory), containing 381 images and their tags. The core tags of this character are `long_hair, breasts, bangs, brown_hair, hat, beret, large_breasts, black_headwear, red_eyes`, which are pruned in this dataset.

Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 381 | 688.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rapi_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 381 | 344.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rapi_nikke/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 985 | 751.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rapi_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 381 | 586.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rapi_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 985 | 1.10 GiB | [Download](https://huggingface.co/datasets/CyberHarem/rapi_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/rapi_nikke',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined here.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, black_jacket, black_thighhighs, long_sleeves, looking_at_viewer, red_necktie, solo, thighs, medium_breasts, open_jacket, parted_lips, red_gloves, sitting, belt_pouch, black_leotard, black_shirt, brown_thighhighs, closed_mouth, cropped_jacket, feet_out_of_frame | | 1 | 14 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_thighhighs, solo, black_jacket, long_sleeves, black_leotard, closed_mouth, holding_gun, looking_at_viewer, pouch, black_gloves, rifle, red_necktie, ammunition_belt, cropped_jacket, orange_eyes, thighs | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, black_panties, black_thighhighs, long_sleeves, looking_at_viewer, solo, white_background, ass_focus, black_gloves, blush, from_behind, simple_background, thighs, black_jacket, looking_back, skindentation, from_below, hand_on_own_ass, thong, underboob | | 3 | 17 | ![](samples/3/clu3-sample0.png) | 
![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, looking_at_viewer, red_necktie, black_jacket, solo, upper_body, grey_shirt, open_mouth, orange_eyes, black_choker, blush, brown_eyes, blurry, gloves, open_jacket, simple_background, white_background, hair_between_eyes, long_sleeves, uniform | | 4 | 16 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1boy, blush, hetero, 1girl, mosaic_censoring, open_mouth, penis, solo_focus, looking_back, long_sleeves, black_jacket, black_panties, cum_on_ass, sex, thong, ejaculation, girl_on_top, cum_on_body, looking_at_viewer, straddling, thighhighs, thighs | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1boy, 1girl, hetero, penis, solo_focus, fellatio, mosaic_censoring, erection, long_sleeves, looking_at_viewer, :>=, black_gloves, black_jacket, blush, pov, sitting | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1boy, 1girl, black_jacket, fellatio, from_side, hetero, uncensored, clothed_female_nude_male, erection, closed_eyes, huge_breasts, solo_focus, star_(symbol), black_thighhighs, grey_bodysuit, grey_gloves, grey_headwear, grey_leotard, grey_thighhighs, kissing_penis, sitting | | 7 | 16 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, solo, thighs, competition_swimsuit, highleg_swimsuit, looking_at_viewer, outdoors, blue_sky, day, blush, closed_mouth, bare_shoulders, cloud, cowboy_shot, goggles_on_head, bare_arms, 
black_one-piece_swimsuit, covered_navel, wet, cleavage, hair_intakes, water, beach, ocean, black_choker, standing, very_long_hair, ass, collarbone, eyewear_on_head, looking_back | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_jacket | black_thighhighs | long_sleeves | looking_at_viewer | red_necktie | solo | thighs | medium_breasts | open_jacket | parted_lips | red_gloves | sitting | belt_pouch | black_leotard | black_shirt | brown_thighhighs | closed_mouth | cropped_jacket | feet_out_of_frame | holding_gun | pouch | black_gloves | rifle | ammunition_belt | orange_eyes | black_panties | white_background | ass_focus | blush | from_behind | simple_background | looking_back | skindentation | from_below | hand_on_own_ass | thong | underboob | upper_body | grey_shirt | open_mouth | black_choker | brown_eyes | blurry | gloves | hair_between_eyes | uniform | 1boy | hetero | mosaic_censoring | penis | solo_focus | cum_on_ass | sex | ejaculation | girl_on_top | cum_on_body | straddling | thighhighs | fellatio | erection | :>= | pov | from_side | uncensored | clothed_female_nude_male | closed_eyes | huge_breasts | star_(symbol) | grey_bodysuit | grey_gloves | grey_headwear | grey_leotard | grey_thighhighs | kissing_penis | competition_swimsuit | highleg_swimsuit | outdoors | blue_sky | day | bare_shoulders | cloud | cowboy_shot | goggles_on_head | bare_arms | black_one-piece_swimsuit | covered_navel | wet | cleavage | hair_intakes | water | beach | ocean | standing | very_long_hair | ass | collarbone | eyewear_on_head | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------------------|:---------------|:--------------------|:--------------|:-------|:---------|:-----------------|:--------------|:--------------|:-------------|:----------|:-------------|:----------------|:--------------|:-------------------|:---------------|:-----------------|:--------------------|:--------------|:--------|:---------------|:--------|:------------------|:--------------|:----------------|:-------------------|:------------|:--------|:--------------|:--------------------|:---------------|:----------------|:-------------|:------------------|:--------|:------------|:-------------|:-------------|:-------------|:---------------|:-------------|:---------|:---------|:--------------------|:----------|:-------|:---------|:-------------------|:--------|:-------------|:-------------|:------|:--------------|:--------------|:--------------|:-------------|:-------------|:-----------|:-----------|:------|:------|:------------|:-------------|:---------------------------|:--------------|:---------------|:----------------|:----------------|:--------------|:----------------|:---------------|:------------------|:----------------|:-----------------------|:-------------------|:-----------|:-----------|:------|:-----------------|:--------|:--------------|:------------------|:------------|:---------------------------|:----------------|:------|:-----------|:---------------|:--------|:--------|:--------|:-----------|:-----------------|:------|:-------------|:------------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | 
| | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 14 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | | | | | | | X | | | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | | X | X | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 17 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | X | X | X | X | | | X | | | | | | | | | | | | | | | | X | | X | | X | | X | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 16 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | X | X | | | X | | | | | | | | | | | | | | | | | | | X | | | X | | | X | | | | X | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | | X | X | | | | | | 
| | X | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | X | | | | | | | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 16 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | | | X | | X | X | | | | | | | | | | X | | | | | | | | | | | | X | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
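The non-raw packages in the package table above are typed IMG+TXT, i.e. each image ships alongside a same-stem `.txt` tag file. A minimal sketch for pairing them after extracting one of the zips; the image-extension list is an assumption, not something the card specifies:

```python
from pathlib import Path

def load_img_txt_pairs(dataset_dir: str):
    """Pair each .txt tag file with its same-stem image file (IMG+TXT layout)."""
    pairs = []
    for txt in sorted(Path(dataset_dir).glob("*.txt")):
        for ext in (".png", ".jpg", ".jpeg", ".webp"):  # assumed image extensions
            img = txt.with_suffix(ext)
            if img.exists():
                # (image path, comma-separated tag string)
                pairs.append((str(img), txt.read_text().strip()))
                break
    return pairs
```

For example, `load_img_txt_pairs('dataset_dir')` on an extracted `dataset-800.zip` would yield `(image, tags)` tuples ready for caption-based training pipelines.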
rr/DDR
---
license: pddl
---
kalivoda/dataset_easy_ocr_v0.3.0_clean
---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: words
    sequence: string
  - name: bboxes
    sequence:
      sequence: float32
  - name: image_path
    dtype: string
  - name: ner_tags
    sequence:
      class_label:
        names:
          '0': DIC
          '1': IBAN
          '2': ICO
          '3': O
          '4': account_number
          '5': bank_code
          '6': const_symbol
          '7': contr_address
          '8': contr_name
          '9': due_date
          '10': invoice_date
          '11': invoice_number
          '12': qr_code
          '13': spec_symbol
          '14': total_amount
          '15': var_symbol
  splits:
  - name: train
    num_bytes: 20705074
    num_examples: 2523
  - name: val
    num_bytes: 2370943
    num_examples: 280
  download_size: 7037725
  dataset_size: 23076017
---
# Dataset Card for "dataset_easy_ocr_v0.3.0_clean"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
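The `ner_tags` column is a sequence of `class_label` indices, so each row stores integers rather than tag strings. A minimal decoding sketch; the list below is copied verbatim from the card's class_label names, and with the `datasets` library the equivalent is `features["ner_tags"].feature.int2str`:

```python
# Label names copied from the card's class_label definition (index -> tag).
NER_TAGS = [
    "DIC", "IBAN", "ICO", "O", "account_number", "bank_code",
    "const_symbol", "contr_address", "contr_name", "due_date",
    "invoice_date", "invoice_number", "qr_code", "spec_symbol",
    "total_amount", "var_symbol",
]

def decode_ner_tags(tag_ids):
    """Map a row's integer ner_tags back to their string labels."""
    return [NER_TAGS[i] for i in tag_ids]

print(decode_ner_tags([3, 11, 3, 15]))  # ['O', 'invoice_number', 'O', 'var_symbol']
```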
BangumiBase/dorohedoro
---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Dorohedoro

This is the image base of bangumi Dorohedoro; we detected 23 characters and 1018 images in total. The full dataset is [here](all.zip).

**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).

Here is the characters' preview:

| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 140 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) |
| 1 | 49 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) |
| 2 | 35 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) |
| 3 | 19 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview 2](3/preview_2.png) | ![preview 3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) |
| 4 | 107 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) |
| 5 | 114 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) |
| 6 | 43 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) |
| 7 | 23 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) |
| 8 | 77 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | ![preview 7](8/preview_7.png) | ![preview 8](8/preview_8.png) |
| 9 | 52 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) |
| 10 | 45 | [Download](10/dataset.zip) | ![preview 1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) |
| 11 | 29 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) |
| 12 | 18 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) |
| 13 | 22 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) |
| 14 | 57 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) |
| 15 | 16 | [Download](15/dataset.zip) | ![preview 1](15/preview_1.png) | ![preview 2](15/preview_2.png) | ![preview 3](15/preview_3.png) | ![preview 4](15/preview_4.png) | ![preview 5](15/preview_5.png) | ![preview 6](15/preview_6.png) | ![preview 7](15/preview_7.png) | ![preview 8](15/preview_8.png) |
| 16 | 23 | [Download](16/dataset.zip) | ![preview 1](16/preview_1.png) | ![preview 2](16/preview_2.png) | ![preview 3](16/preview_3.png) | ![preview 4](16/preview_4.png) | ![preview 5](16/preview_5.png) | ![preview 6](16/preview_6.png) | ![preview 7](16/preview_7.png) | ![preview 8](16/preview_8.png) |
| 17 | 37 | [Download](17/dataset.zip) | ![preview 1](17/preview_1.png) | ![preview 2](17/preview_2.png) | ![preview 3](17/preview_3.png) | ![preview 4](17/preview_4.png) | ![preview 5](17/preview_5.png) | ![preview 6](17/preview_6.png) | ![preview 7](17/preview_7.png) | ![preview 8](17/preview_8.png) |
| 18 | 8 | [Download](18/dataset.zip) | ![preview 1](18/preview_1.png) | ![preview 2](18/preview_2.png) | ![preview 3](18/preview_3.png) | ![preview 4](18/preview_4.png) | ![preview 5](18/preview_5.png) | ![preview 6](18/preview_6.png) | ![preview 7](18/preview_7.png) | ![preview 8](18/preview_8.png) |
| 19 | 19 | [Download](19/dataset.zip) | ![preview 1](19/preview_1.png) | ![preview 2](19/preview_2.png) | ![preview 3](19/preview_3.png) | ![preview 4](19/preview_4.png) | ![preview 5](19/preview_5.png) | ![preview 6](19/preview_6.png) | ![preview 7](19/preview_7.png) | ![preview 8](19/preview_8.png) |
| 20 | 12 | [Download](20/dataset.zip) | ![preview 1](20/preview_1.png) | ![preview 2](20/preview_2.png) | ![preview 3](20/preview_3.png) | ![preview 4](20/preview_4.png) | ![preview 5](20/preview_5.png) | ![preview 6](20/preview_6.png) | ![preview 7](20/preview_7.png) | ![preview 8](20/preview_8.png) |
| 21 | 10 | [Download](21/dataset.zip) | ![preview 1](21/preview_1.png) | ![preview 2](21/preview_2.png) | ![preview 3](21/preview_3.png) | ![preview 4](21/preview_4.png) | ![preview 5](21/preview_5.png) | ![preview 6](21/preview_6.png) | ![preview 7](21/preview_7.png) | ![preview 8](21/preview_8.png) |
| noise | 63 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
Thanmay/truthful_qa_multiple_choice-hi
--- dataset_info: features: - name: question dtype: string - name: mc1_targets struct: - name: choices sequence: string - name: labels sequence: int32 - name: mc2_targets struct: - name: choices sequence: string - name: labels sequence: int32 - name: itv2 hi question dtype: string - name: itv2 hi mc1_targets struct: - name: choices sequence: string - name: labels sequence: int64 - name: itv2 hi mc2_targets struct: - name: choices sequence: string - name: labels sequence: int64 splits: - name: validation num_bytes: 2177577 num_examples: 817 download_size: 710790 dataset_size: 2177577 configs: - config_name: default data_files: - split: validation path: data/validation-* ---
scikit-learn/credit-card-clients
--- license: cc0-1.0 --- ## Default of Credit Card Clients Dataset The following was retrieved from the [UCI machine learning repository](https://archive.ics.uci.edu/ml/datasets/default+of+credit+card+clients). **Dataset Information** This dataset contains information on default payments, demographic factors, credit data, history of payment, and bill statements of credit card clients in Taiwan from April 2005 to September 2005. **Content** There are 25 variables: - ID: ID of each client - LIMIT_BAL: Amount of given credit in NT dollars (includes individual and family/supplementary credit) - SEX: Gender (1=male, 2=female) - EDUCATION: (1=graduate school, 2=university, 3=high school, 4=others, 5=unknown, 6=unknown) - MARRIAGE: Marital status (1=married, 2=single, 3=others) - AGE: Age in years - PAY_0: Repayment status in September, 2005 (-1=pay duly, 1=payment delay for one month, 2=payment delay for two months, … 8=payment delay for eight months, 9=payment delay for nine months and above) - PAY_2: Repayment status in August, 2005 (scale same as above) - PAY_3: Repayment status in July, 2005 (scale same as above) - PAY_4: Repayment status in June, 2005 (scale same as above) - PAY_5: Repayment status in May, 2005 (scale same as above) - PAY_6: Repayment status in April, 2005 (scale same as above) - BILL_AMT1: Amount of bill statement in September, 2005 (NT dollar) - BILL_AMT2: Amount of bill statement in August, 2005 (NT dollar) - BILL_AMT3: Amount of bill statement in July, 2005 (NT dollar) - BILL_AMT4: Amount of bill statement in June, 2005 (NT dollar) - BILL_AMT5: Amount of bill statement in May, 2005 (NT dollar) - BILL_AMT6: Amount of bill statement in April, 2005 (NT dollar) - PAY_AMT1: Amount of previous payment in September, 2005 (NT dollar) - PAY_AMT2: Amount of previous payment in August, 2005 (NT dollar) - PAY_AMT3: Amount of previous payment in July, 2005 (NT dollar) - PAY_AMT4: Amount of previous payment in June, 2005 (NT dollar) - PAY_AMT5: Amount of previous 
payment in May, 2005 (NT dollar) - PAY_AMT6: Amount of previous payment in April, 2005 (NT dollar) - default.payment.next.month: Default payment (1=yes, 0=no) **Inspiration** Some ideas for exploration: How does the probability of default payment vary by categories of different demographic variables? Which variables are the strongest predictors of default payment? **Acknowledgements** Any publications based on this dataset should acknowledge the following: Lichman, M. (2013). UCI Machine Learning Repository [http://archive.ics.uci.edu/ml]. Irvine, CA: University of California, School of Information and Computer Science.
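The exploration questions above (how default probability varies across demographic categories) can be sketched with pandas. This is an illustrative sketch only: the sample rows below are invented, using the documented column names, and do not come from the dataset itself — with the real data you would load the full table instead.

```python
import pandas as pd

# Hand-made sample mirroring the documented schema (values are invented).
sample = pd.DataFrame({
    "LIMIT_BAL": [20000, 120000, 90000, 50000],
    "EDUCATION": [2, 2, 1, 3],  # 1=graduate school, 2=university, 3=high school
    "default.payment.next.month": [1, 0, 0, 1],
})

# Probability of default within each EDUCATION category.
default_rate = sample.groupby("EDUCATION")["default.payment.next.month"].mean()
print(default_rate)
```

The same `groupby`/`mean` pattern extends to SEX, MARRIAGE, or binned AGE to probe which demographic variables correlate with default.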
Falah/chapter7_1_prompts
--- dataset_info: features: - name: prompts dtype: string splits: - name: train num_bytes: 3050 num_examples: 10 download_size: 3219 dataset_size: 3050 --- # Dataset Card for "chapter7_1_prompts" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
LucasThil/miniwob_plusplus_T5_randomized_ref2
--- dataset_info: features: - name: history_episodes dtype: string - name: instruction dtype: string - name: html_snippets dtype: string - name: actions dtype: string - name: refs dtype: int64 - name: keydown_texts dtype: string splits: - name: train num_bytes: 267456938 num_examples: 60321 download_size: 0 dataset_size: 267456938 --- # Dataset Card for "miniwob_plusplus_T5_randomized_ref2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
christykoh/imdb_zh
--- dataset_info: features: - name: text dtype: string - name: label dtype: class_label: names: '0': neg '1': pos splits: - name: train num_bytes: 18760648 num_examples: 25000 - name: test num_bytes: 18574771 num_examples: 25000 download_size: 23908717 dataset_size: 37335419 --- # Dataset Card for "imdb_zh" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Anas989898/DPO-datascience
--- dataset_info: features: - name: prompt dtype: string - name: regected dtype: string - name: chosen dtype: string splits: - name: train num_bytes: 2507192 num_examples: 1096 - name: test num_bytes: 751956 num_examples: 300 download_size: 1510394 dataset_size: 3259148 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
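The schema above stores one preference pair per row — note that the rejected column is spelled `regected` in the features list, so code reading this dataset must use that spelling. A minimal sketch of turning such rows into (prompt, chosen, rejected) triples for a DPO-style trainer; the sample record is invented for illustration, not drawn from the dataset:

```python
# Illustrative only: a hand-made record using the card's exact field names.
records = [
    {
        "prompt": "How do I drop rows with missing values in pandas?",
        "chosen": "Use df.dropna(), optionally with subset= to limit columns.",
        "regected": "Delete the CSV file and start over.",  # note the column's spelling
    }
]

# A DPO-style trainer typically consumes (prompt, chosen, rejected) triples.
triples = [(r["prompt"], r["chosen"], r["regected"]) for r in records]
print(triples[0][0])
```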
CyberNative/Code_Vulnerability_Security_DPO
--- license: apache-2.0 tags: - dpo - cybersecurity - programming - code - Python pretty_name: Code Vulnerability and Security DPO Dataset --- # Cybernative.ai Code Vulnerability and Security Dataset ## Dataset Description The Cybernative.ai Code Vulnerability and Security Dataset is a collection of synthetic Direct Preference Optimization (DPO) pairs, focusing on the intricate relationship between secure and insecure code across a variety of programming languages. This dataset is meticulously crafted to serve as a pivotal resource for researchers, cybersecurity professionals, and AI developers who are keen on understanding, identifying, and mitigating vulnerabilities in code. This dataset is generated using [LoneStriker/deepseek-coder-33b-instruct-4.0bpw-h6-exl2](https://huggingface.co/LoneStriker/deepseek-coder-33b-instruct-4.0bpw-h6-exl2) ### Languages Covered The dataset spans an array of popular programming languages, including but not limited to: - C++ - Python - Java - JavaScript - C# - PHP - Ruby - Swift - Go - Kotlin - Fortran Each entry in the dataset is generated through a sophisticated AI-driven process, ensuring a diverse and realistic range of code examples. This approach guarantees that the dataset is not only extensive but also mirrors real-world coding practices and scenarios. ### Dataset Structure The dataset is organized into pairs of vulnerable and fixed code snippets, accompanied by a task description that serves as a question. This structure is designed to facilitate the development and evaluation of AI models capable of understanding and rectifying code vulnerabilities. - **Vulnerable Code**: A code snippet that contains a specific vulnerability, written in a professional, realistic manner but intentionally insecure and inefficient. - **Fixed Code**: A secure and optimized version of the vulnerable code, adhering to best practices and efficient methods. 
- **Task Description**: A high-level instruction that applies to both the vulnerable and fixed code, providing context and serving as a question for model evaluation. ### Use Cases The Cybernative.ai Code Vulnerability and Security Dataset is ideal for a variety of applications, including but not limited to: - Training AI models to identify code vulnerabilities. - Developing tools for automated code review and security auditing. - Enhancing educational resources for teaching secure coding practices. - Benchmarking the performance of code analysis and vulnerability detection algorithms. ### Accessing the Dataset The dataset is hosted on the Hugging Face Datasets platform, allowing for easy access and integration into machine learning workflows. Users can download the dataset directly from the platform and leverage its extensive tooling and community support for dataset manipulation and model training. ### Contributing Cybernative.ai encourages contributions to the dataset. Whether it's by submitting additional code pairs, suggesting improvements, or reporting issues, community involvement is pivotal in ensuring the dataset's quality and relevance. ### About Cybernative.ai Cybernative.ai is an AI Social Network dedicated to fostering innovation and collaboration in the field of artificial intelligence. By providing resources like the Code Vulnerability and Security Dataset, Cybernative.ai aims to empower developers, researchers, and enthusiasts to tackle the challenges of cybersecurity and AI development together. Join us in our mission to make the digital world more secure through the power of AI. Visit [Cybernative.ai](https://cybernative.ai) to explore more resources, connect with experts, and contribute to various AI and cybersecurity projects.
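The vulnerable/fixed pairing described above maps naturally onto DPO's (prompt, chosen, rejected) format. The sketch below illustrates that mapping with a classic command-injection example; the field names are placeholders, not the dataset's actual column names, and the snippet strings are invented for illustration:

```python
# Sketch only: keys below are placeholders, NOT the dataset's real columns.
pair = {
    "task": "Run a shell command built from user input safely.",
    "vulnerable": 'import os\nos.system("ping " + user_host)  # shell injection risk',
    "fixed": 'import subprocess\nsubprocess.run(["ping", user_host], check=True)',
}

# DPO preference triple: the task is the prompt, the secure version is
# "chosen" and the insecure version is "rejected".
dpo_example = (pair["task"], pair["fixed"], pair["vulnerable"])
print(dpo_example[0])
```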
fw1zr/rahul-gandhi-captions
--- dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 4491114.0 num_examples: 116 download_size: 4452636 dataset_size: 4491114.0 --- # Dataset Card for "rahul-gandhi-captions" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
carnival13/eng_sur_DA_tokenized_rt5
--- dataset_info: features: - name: pass_label dtype: int64 - name: input_ids sequence: int32 - name: attention_mask sequence: int8 splits: - name: train num_bytes: 104310930 num_examples: 155590 download_size: 23898508 dataset_size: 104310930 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "eng_sur_DA_tokenized_rt5" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
distilled-from-one-sec-cv12/chunk_203
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 1219101468 num_examples: 237549 download_size: 1245791171 dataset_size: 1219101468 --- # Dataset Card for "chunk_203" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_julleong__illuni-llama-2-ko-7b-test
--- pretty_name: Evaluation run of julleong/illuni-llama-2-ko-7b-test dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [julleong/illuni-llama-2-ko-7b-test](https://huggingface.co/julleong/illuni-llama-2-ko-7b-test)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_julleong__illuni-llama-2-ko-7b-test\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-07T14:52:33.862107](https://huggingface.co/datasets/open-llm-leaderboard/details_julleong__illuni-llama-2-ko-7b-test/blob/main/results_2024-03-07T14-52-33.862107.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.29338568029421425,\n\ \ \"acc_stderr\": 0.032029488235203775,\n \"acc_norm\": 0.2955428748433477,\n\ \ \"acc_norm_stderr\": 0.03281437925902047,\n \"mc1\": 0.19951040391676866,\n\ \ \"mc1_stderr\": 0.013989929967559647,\n \"mc2\": 0.3329691460247487,\n\ \ \"mc2_stderr\": 0.014677158673168721\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.3916382252559727,\n \"acc_stderr\": 0.014264122124938213,\n\ \ \"acc_norm\": 0.43430034129692835,\n \"acc_norm_stderr\": 0.014484703048857357\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4887472615016929,\n\ \ \"acc_stderr\": 0.004988517597998613,\n \"acc_norm\": 0.6485759808803028,\n\ \ \"acc_norm_stderr\": 0.00476439398511103\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680814,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680814\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37037037037037035,\n\ \ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.37037037037037035,\n\ \ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.03459777606810538,\n\ \ \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.03459777606810538\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\ \ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \ \ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.02761116340239972,\n\ \ \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.02761116340239972\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n\ \ \"acc_stderr\": 0.03745554791462457,\n \"acc_norm\": 0.2777777777777778,\n\ \ \"acc_norm_stderr\": 
0.03745554791462457\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n\ \ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \ \ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\ \ \"acc_stderr\": 0.03186209851641143,\n \"acc_norm\": 0.2254335260115607,\n\ \ \"acc_norm_stderr\": 0.03186209851641143\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493496,\n\ \ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493496\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\ \ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.3446808510638298,\n \"acc_stderr\": 0.03106898596312215,\n\ \ \"acc_norm\": 0.3446808510638298,\n \"acc_norm_stderr\": 0.03106898596312215\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\ \ \"acc_stderr\": 0.03999423879281335,\n \"acc_norm\": 0.23684210526315788,\n\ \ \"acc_norm_stderr\": 0.03999423879281335\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n\ \ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.2275132275132275,\n \"acc_stderr\": 0.021591269407823785,\n \"\ acc_norm\": 
0.2275132275132275,\n \"acc_norm_stderr\": 0.021591269407823785\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\ \ \"acc_stderr\": 0.0339549002085611,\n \"acc_norm\": 0.1746031746031746,\n\ \ \"acc_norm_stderr\": 0.0339549002085611\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3,\n\ \ \"acc_stderr\": 0.02606936229533514,\n \"acc_norm\": 0.3,\n \ \ \"acc_norm_stderr\": 0.02606936229533514\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.031447125816782426,\n\ \ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.031447125816782426\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\ : 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.3212121212121212,\n \"acc_stderr\": 0.03646204963253812,\n\ \ \"acc_norm\": 0.3212121212121212,\n \"acc_norm_stderr\": 0.03646204963253812\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.31313131313131315,\n \"acc_stderr\": 0.033042050878136525,\n \"\ acc_norm\": 0.31313131313131315,\n \"acc_norm_stderr\": 0.033042050878136525\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.29015544041450775,\n \"acc_stderr\": 0.03275264467791516,\n\ \ \"acc_norm\": 0.29015544041450775,\n \"acc_norm_stderr\": 0.03275264467791516\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.24615384615384617,\n \"acc_stderr\": 0.021840866990423084,\n\ \ \"acc_norm\": 0.24615384615384617,\n \"acc_norm_stderr\": 0.021840866990423084\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844082,\n \ \ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844082\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.2815126050420168,\n \"acc_stderr\": 0.02921354941437217,\n \ \ \"acc_norm\": 0.2815126050420168,\n \"acc_norm_stderr\": 0.02921354941437217\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.24503311258278146,\n \"acc_stderr\": 0.035118075718047245,\n \"\ acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.035118075718047245\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.3614678899082569,\n \"acc_stderr\": 0.02059808200993737,\n \"\ acc_norm\": 0.3614678899082569,\n \"acc_norm_stderr\": 0.02059808200993737\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.23148148148148148,\n \"acc_stderr\": 0.028765111718046934,\n \"\ acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.028765111718046934\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"\ acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.2911392405063291,\n \"acc_stderr\": 0.029571601065753367,\n \ \ \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.029571601065753367\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3721973094170404,\n\ \ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.3721973094170404,\n\ \ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847835,\n\ \ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847835\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.38016528925619836,\n \"acc_stderr\": 0.04431324501968432,\n \"\ acc_norm\": 0.38016528925619836,\n \"acc_norm_stderr\": 0.04431324501968432\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3055555555555556,\n\ \ \"acc_stderr\": 0.04453197507374984,\n \"acc_norm\": 0.3055555555555556,\n\ \ \"acc_norm_stderr\": 0.04453197507374984\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n\ \ \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\ \ \"acc_stderr\": 0.042878587513404544,\n \"acc_norm\": 0.2857142857142857,\n\ \ \"acc_norm_stderr\": 0.042878587513404544\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.33980582524271846,\n \"acc_stderr\": 0.046897659372781335,\n\ \ \"acc_norm\": 0.33980582524271846,\n \"acc_norm_stderr\": 0.046897659372781335\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.37606837606837606,\n\ \ \"acc_stderr\": 0.031733936329694803,\n \"acc_norm\": 0.37606837606837606,\n\ \ \"acc_norm_stderr\": 0.031733936329694803\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3895274584929757,\n\ \ \"acc_stderr\": 0.0174380825562646,\n \"acc_norm\": 0.3895274584929757,\n\ \ \"acc_norm_stderr\": 0.0174380825562646\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.2861271676300578,\n \"acc_stderr\": 0.02433214677913413,\n\ \ \"acc_norm\": 0.2861271676300578,\n \"acc_norm_stderr\": 0.02433214677913413\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\ \ \"acc_stderr\": 0.014310999547961436,\n 
\"acc_norm\": 0.24134078212290502,\n\ \ \"acc_norm_stderr\": 0.014310999547961436\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.025553169991826514,\n\ \ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.025553169991826514\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3086816720257235,\n\ \ \"acc_stderr\": 0.026236965881153262,\n \"acc_norm\": 0.3086816720257235,\n\ \ \"acc_norm_stderr\": 0.026236965881153262\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.2839506172839506,\n \"acc_stderr\": 0.02508947852376513,\n\ \ \"acc_norm\": 0.2839506172839506,\n \"acc_norm_stderr\": 0.02508947852376513\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.28368794326241137,\n \"acc_stderr\": 0.02689170942834396,\n \ \ \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.02689170942834396\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2666232073011734,\n\ \ \"acc_stderr\": 0.011293836031612135,\n \"acc_norm\": 0.2666232073011734,\n\ \ \"acc_norm_stderr\": 0.011293836031612135\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.3014705882352941,\n \"acc_stderr\": 0.027875982114273168,\n\ \ \"acc_norm\": 0.3014705882352941,\n \"acc_norm_stderr\": 0.027875982114273168\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.29248366013071897,\n \"acc_stderr\": 0.01840341571010979,\n \ \ \"acc_norm\": 0.29248366013071897,\n \"acc_norm_stderr\": 0.01840341571010979\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.32727272727272727,\n\ \ \"acc_stderr\": 0.04494290866252088,\n \"acc_norm\": 0.32727272727272727,\n\ \ \"acc_norm_stderr\": 0.04494290866252088\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.22857142857142856,\n \"acc_stderr\": 0.02688214492230774,\n\ \ \"acc_norm\": 0.22857142857142856,\n \"acc_norm_stderr\": 
0.02688214492230774\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.29850746268656714,\n\ \ \"acc_stderr\": 0.03235743789355041,\n \"acc_norm\": 0.29850746268656714,\n\ \ \"acc_norm_stderr\": 0.03235743789355041\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n\ \ \"acc_stderr\": 0.03664314777288086,\n \"acc_norm\": 0.3313253012048193,\n\ \ \"acc_norm_stderr\": 0.03664314777288086\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.38596491228070173,\n \"acc_stderr\": 0.03733756969066165,\n\ \ \"acc_norm\": 0.38596491228070173,\n \"acc_norm_stderr\": 0.03733756969066165\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.19951040391676866,\n\ \ \"mc1_stderr\": 0.013989929967559647,\n \"mc2\": 0.3329691460247487,\n\ \ \"mc2_stderr\": 0.014677158673168721\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.6377269139700079,\n \"acc_stderr\": 0.013508855476252508\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.024260803639120546,\n \ \ \"acc_stderr\": 0.004238007900001403\n }\n}\n```" repo_url: https://huggingface.co/julleong/illuni-llama-2-ko-7b-test leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|arc:challenge|25_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-07T14-52-33.862107.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|gsm8k|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hellaswag_10 data_files: - 
split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hellaswag|10_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T14-52-33.862107.parquet' - 
'**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T14-52-33.862107.parquet' - 
'**/details_harness|hendrycksTest-management|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T14-52-33.862107.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T14-52-33.862107.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T14-52-33.862107.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-07T14-52-33.862107.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T14-52-33.862107.parquet' - config_name: 
harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 
2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - 
'**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T14-52-33.862107.parquet' - config_name: 
harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 
data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-management|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-07T14-52-33.862107.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - 
split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T14-52-33.862107.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|truthfulqa:mc|0_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-07T14-52-33.862107.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_07T14_52_33.862107 path: - '**/details_harness|winogrande|5_2024-03-07T14-52-33.862107.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-07T14-52-33.862107.parquet' - config_name: results data_files: - split: 
2024_03_07T14_52_33.862107 path: - results_2024-03-07T14-52-33.862107.parquet - split: latest path: - results_2024-03-07T14-52-33.862107.parquet
---

# Dataset Card for Evaluation run of julleong/illuni-llama-2-ko-7b-test

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [julleong/illuni-llama-2-ko-7b-test](https://huggingface.co/julleong/illuni-llama-2-ko-7b-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_julleong__illuni-llama-2-ko-7b-test",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-03-07T14:52:33.862107](https://huggingface.co/datasets/open-llm-leaderboard/details_julleong__illuni-llama-2-ko-7b-test/blob/main/results_2024-03-07T14-52-33.862107.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You can find each one in the "results" and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.29338568029421425,
        "acc_stderr": 0.032029488235203775,
        "acc_norm": 0.2955428748433477,
        "acc_norm_stderr": 0.03281437925902047,
        "mc1": 0.19951040391676866,
        "mc1_stderr": 0.013989929967559647,
        "mc2": 0.3329691460247487,
        "mc2_stderr": 0.014677158673168721
    },
    "harness|arc:challenge|25": {
        "acc": 0.3916382252559727,
        "acc_stderr": 0.014264122124938213,
        "acc_norm": 0.43430034129692835,
        "acc_norm_stderr": 0.014484703048857357
    },
    "harness|hellaswag|10": {
        "acc": 0.4887472615016929,
        "acc_stderr": 0.004988517597998613,
        "acc_norm": 0.6485759808803028,
        "acc_norm_stderr": 0.00476439398511103
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.26,
        "acc_stderr": 0.044084400227680814,
        "acc_norm": 0.26,
        "acc_norm_stderr": 0.044084400227680814
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.37037037037037035,
        "acc_stderr": 0.041716541613545426,
        "acc_norm": 0.37037037037037035,
        "acc_norm_stderr": 0.041716541613545426
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.23684210526315788,
        "acc_stderr": 0.03459777606810538,
        "acc_norm": 0.23684210526315788,
        "acc_norm_stderr": 0.03459777606810538
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.3,
        "acc_stderr": 0.046056618647183814,
        "acc_norm": 0.3,
        "acc_norm_stderr": 0.046056618647183814
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.2792452830188679,
        "acc_stderr": 0.02761116340239972,
        "acc_norm": 0.2792452830188679,
        "acc_norm_stderr": 0.02761116340239972
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.2777777777777778,
        "acc_stderr": 0.03745554791462457,
        "acc_norm": 0.2777777777777778,
        "acc_norm_stderr": 0.03745554791462457
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.24,
        "acc_stderr": 0.04292346959909283,
        "acc_norm": 0.24,
        "acc_norm_stderr": 0.04292346959909283
    },
    "harness|hendrycksTest-college_computer_science|5": {
        "acc": 0.26,
        "acc_stderr": 0.04408440022768079,
        "acc_norm": 0.26,
        "acc_norm_stderr": 0.04408440022768079
    },
    "harness|hendrycksTest-college_mathematics|5": {
        "acc": 0.23,
        "acc_stderr": 0.04229525846816508,
        "acc_norm": 0.23,
        "acc_norm_stderr": 0.04229525846816508
    },
    "harness|hendrycksTest-college_medicine|5": {
        "acc": 0.2254335260115607,
        "acc_stderr": 0.03186209851641143,
        "acc_norm": 0.2254335260115607,
        "acc_norm_stderr": 0.03186209851641143
    },
    "harness|hendrycksTest-college_physics|5": {
        "acc": 0.18627450980392157,
        "acc_stderr": 0.038739587141493496,
        "acc_norm": 0.18627450980392157,
        "acc_norm_stderr": 0.038739587141493496
    },
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.34,
        "acc_stderr": 0.04760952285695235,
        "acc_norm": 0.34,
        "acc_norm_stderr": 0.04760952285695235
    },
    "harness|hendrycksTest-conceptual_physics|5": {
        "acc": 0.3446808510638298,
        "acc_stderr": 0.03106898596312215,
        "acc_norm": 0.3446808510638298,
        "acc_norm_stderr": 0.03106898596312215
    },
    "harness|hendrycksTest-econometrics|5": {
        "acc": 0.23684210526315788,
        "acc_stderr": 0.03999423879281335,
        "acc_norm": 0.23684210526315788,
        "acc_norm_stderr": 0.03999423879281335
    },
    "harness|hendrycksTest-electrical_engineering|5": {
        "acc": 0.2689655172413793,
        "acc_stderr": 0.03695183311650232,
        "acc_norm": 0.2689655172413793,
        "acc_norm_stderr": 0.03695183311650232
    },
    "harness|hendrycksTest-elementary_mathematics|5": {
        "acc": 0.2275132275132275,
        "acc_stderr": 0.021591269407823785,
        "acc_norm": 0.2275132275132275,
        "acc_norm_stderr": 0.021591269407823785
    },
    "harness|hendrycksTest-formal_logic|5": {
        "acc": 0.1746031746031746,
        "acc_stderr": 0.0339549002085611,
        "acc_norm": 0.1746031746031746,
        "acc_norm_stderr": 0.0339549002085611
    },
    "harness|hendrycksTest-global_facts|5": {
        "acc": 0.24,
        "acc_stderr": 0.04292346959909284,
        "acc_norm": 0.24,
        "acc_norm_stderr": 0.04292346959909284
    },
    "harness|hendrycksTest-high_school_biology|5": {
        "acc": 0.3,
        "acc_stderr": 0.02606936229533514,
        "acc_norm": 0.3,
        "acc_norm_stderr": 0.02606936229533514
    },
    "harness|hendrycksTest-high_school_chemistry|5": {
        "acc": 0.27586206896551724,
        "acc_stderr": 0.031447125816782426,
        "acc_norm": 0.27586206896551724,
        "acc_norm_stderr": 0.031447125816782426
    },
    "harness|hendrycksTest-high_school_computer_science|5": {
        "acc": 0.32,
        "acc_stderr": 0.04688261722621504,
        "acc_norm": 0.32,
        "acc_norm_stderr": 0.04688261722621504
    },
    "harness|hendrycksTest-high_school_european_history|5": {
        "acc": 0.3212121212121212,
        "acc_stderr": 0.03646204963253812,
        "acc_norm": 0.3212121212121212,
        "acc_norm_stderr": 0.03646204963253812
    },
    "harness|hendrycksTest-high_school_geography|5": {
        "acc": 0.31313131313131315,
        "acc_stderr": 0.033042050878136525,
        "acc_norm": 0.31313131313131315,
        "acc_norm_stderr": 0.033042050878136525
    },
    "harness|hendrycksTest-high_school_government_and_politics|5": {
        "acc": 0.29015544041450775,
        "acc_stderr": 0.03275264467791516,
        "acc_norm": 0.29015544041450775,
        "acc_norm_stderr": 0.03275264467791516
    },
    "harness|hendrycksTest-high_school_macroeconomics|5": {
        "acc": 0.24615384615384617,
        "acc_stderr": 0.021840866990423084,
        "acc_norm": 0.24615384615384617,
        "acc_norm_stderr": 0.021840866990423084
    },
    "harness|hendrycksTest-high_school_mathematics|5": {
        "acc": 0.25555555555555554,
        "acc_stderr": 0.026593939101844082,
        "acc_norm": 0.25555555555555554,
        "acc_norm_stderr": 0.026593939101844082
    },
    "harness|hendrycksTest-high_school_microeconomics|5": {
        "acc": 0.2815126050420168,
        "acc_stderr": 0.02921354941437217,
        "acc_norm": 0.2815126050420168,
        "acc_norm_stderr": 0.02921354941437217
    },
    "harness|hendrycksTest-high_school_physics|5": {
        "acc": 0.24503311258278146,
        "acc_stderr": 0.035118075718047245,
        "acc_norm": 0.24503311258278146,
        "acc_norm_stderr": 0.035118075718047245
    },
    "harness|hendrycksTest-high_school_psychology|5": {
        "acc": 0.3614678899082569,
        "acc_stderr": 0.02059808200993737,
        "acc_norm": 0.3614678899082569,
        "acc_norm_stderr": 0.02059808200993737
    },
    "harness|hendrycksTest-high_school_statistics|5": {
        "acc": 0.23148148148148148,
        "acc_stderr": 0.028765111718046934,
        "acc_norm": 0.23148148148148148,
        "acc_norm_stderr": 0.028765111718046934
    },
    "harness|hendrycksTest-high_school_us_history|5": {
        "acc": 0.24019607843137256,
        "acc_stderr": 0.02998373305591361,
        "acc_norm": 0.24019607843137256,
        "acc_norm_stderr": 0.02998373305591361
    },
    "harness|hendrycksTest-high_school_world_history|5": {
        "acc": 0.2911392405063291,
        "acc_stderr": 0.029571601065753367,
        "acc_norm": 0.2911392405063291,
        "acc_norm_stderr": 0.029571601065753367
    },
    "harness|hendrycksTest-human_aging|5": {
        "acc": 0.3721973094170404,
        "acc_stderr": 0.03244305283008731,
        "acc_norm": 0.3721973094170404,
        "acc_norm_stderr": 0.03244305283008731
    },
    "harness|hendrycksTest-human_sexuality|5": {
        "acc": 0.2748091603053435,
        "acc_stderr": 0.03915345408847835,
        "acc_norm": 0.2748091603053435,
        "acc_norm_stderr": 0.03915345408847835
    },
    "harness|hendrycksTest-international_law|5": {
        "acc": 0.38016528925619836,
        "acc_stderr": 0.04431324501968432,
        "acc_norm": 0.38016528925619836,
        "acc_norm_stderr": 0.04431324501968432
    },
    "harness|hendrycksTest-jurisprudence|5": {
        "acc": 0.3055555555555556,
        "acc_stderr": 0.04453197507374984,
        "acc_norm": 0.3055555555555556,
        "acc_norm_stderr": 0.04453197507374984
    },
    "harness|hendrycksTest-logical_fallacies|5": {
        "acc": 0.2883435582822086,
        "acc_stderr": 0.035590395316173425,
        "acc_norm": 0.2883435582822086,
        "acc_norm_stderr": 0.035590395316173425
    },
    "harness|hendrycksTest-machine_learning|5": {
        "acc": 0.2857142857142857,
        "acc_stderr": 0.042878587513404544,
        "acc_norm": 0.2857142857142857,
        "acc_norm_stderr": 0.042878587513404544
    },
    "harness|hendrycksTest-management|5": {
        "acc": 0.33980582524271846,
        "acc_stderr": 0.046897659372781335,
        "acc_norm": 0.33980582524271846,
        "acc_norm_stderr": 0.046897659372781335
    },
    "harness|hendrycksTest-marketing|5": {
        "acc": 0.37606837606837606,
        "acc_stderr": 0.031733936329694803,
        "acc_norm": 0.37606837606837606,
        "acc_norm_stderr": 0.031733936329694803
    },
    "harness|hendrycksTest-medical_genetics|5": {
        "acc": 0.24,
        "acc_stderr": 0.042923469599092816,
        "acc_norm": 0.24,
        "acc_norm_stderr": 0.042923469599092816
    },
    "harness|hendrycksTest-miscellaneous|5": {
        "acc": 0.3895274584929757,
        "acc_stderr": 0.0174380825562646,
        "acc_norm": 0.3895274584929757,
        "acc_norm_stderr": 0.0174380825562646
    },
    "harness|hendrycksTest-moral_disputes|5": {
        "acc": 0.2861271676300578,
        "acc_stderr": 0.02433214677913413,
        "acc_norm": 0.2861271676300578,
        "acc_norm_stderr": 0.02433214677913413
    },
    "harness|hendrycksTest-moral_scenarios|5": {
        "acc": 0.24134078212290502,
        "acc_stderr": 0.014310999547961436,
        "acc_norm": 0.24134078212290502,
        "acc_norm_stderr": 0.014310999547961436
    },
    "harness|hendrycksTest-nutrition|5": {
        "acc": 0.27450980392156865,
        "acc_stderr": 0.025553169991826514,
        "acc_norm": 0.27450980392156865,
        "acc_norm_stderr": 0.025553169991826514
    },
    "harness|hendrycksTest-philosophy|5": {
        "acc": 0.3086816720257235,
        "acc_stderr": 0.026236965881153262,
        "acc_norm": 0.3086816720257235,
        "acc_norm_stderr": 0.026236965881153262
    },
    "harness|hendrycksTest-prehistory|5": {
        "acc": 0.2839506172839506,
        "acc_stderr": 0.02508947852376513,
        "acc_norm": 0.2839506172839506,
        "acc_norm_stderr": 0.02508947852376513
    },
    "harness|hendrycksTest-professional_accounting|5": {
        "acc": 0.28368794326241137,
        "acc_stderr": 0.02689170942834396,
        "acc_norm": 0.28368794326241137,
        "acc_norm_stderr": 0.02689170942834396
    },
    "harness|hendrycksTest-professional_law|5": {
        "acc": 0.2666232073011734,
        "acc_stderr": 0.011293836031612135,
        "acc_norm": 0.2666232073011734,
        "acc_norm_stderr": 0.011293836031612135
    },
    "harness|hendrycksTest-professional_medicine|5": {
        "acc": 0.3014705882352941,
        "acc_stderr": 0.027875982114273168,
        "acc_norm": 0.3014705882352941,
        "acc_norm_stderr": 0.027875982114273168
    },
    "harness|hendrycksTest-professional_psychology|5": {
        "acc": 0.29248366013071897,
        "acc_stderr": 0.01840341571010979,
        "acc_norm": 0.29248366013071897,
        "acc_norm_stderr": 0.01840341571010979
    },
    "harness|hendrycksTest-public_relations|5": {
        "acc": 0.32727272727272727,
        "acc_stderr": 0.04494290866252088,
"acc_norm": 0.32727272727272727, "acc_norm_stderr": 0.04494290866252088 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.22857142857142856, "acc_stderr": 0.02688214492230774, "acc_norm": 0.22857142857142856, "acc_norm_stderr": 0.02688214492230774 }, "harness|hendrycksTest-sociology|5": { "acc": 0.29850746268656714, "acc_stderr": 0.03235743789355041, "acc_norm": 0.29850746268656714, "acc_norm_stderr": 0.03235743789355041 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-virology|5": { "acc": 0.3313253012048193, "acc_stderr": 0.03664314777288086, "acc_norm": 0.3313253012048193, "acc_norm_stderr": 0.03664314777288086 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.38596491228070173, "acc_stderr": 0.03733756969066165, "acc_norm": 0.38596491228070173, "acc_norm_stderr": 0.03733756969066165 }, "harness|truthfulqa:mc|0": { "mc1": 0.19951040391676866, "mc1_stderr": 0.013989929967559647, "mc2": 0.3329691460247487, "mc2_stderr": 0.014677158673168721 }, "harness|winogrande|5": { "acc": 0.6377269139700079, "acc_stderr": 0.013508855476252508 }, "harness|gsm8k|5": { "acc": 0.024260803639120546, "acc_stderr": 0.004238007900001403 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
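The per-subtask `acc`/`acc_stderr` pairs in the results block earlier in this card are internally consistent with the usual sample standard error of a proportion. A quick sanity check (this assumes the harness computes `acc_stderr = sqrt(p * (1 - p) / (n - 1))`, and that n = 100 questions for the `high_school_computer_science` subtask, taken from the public MMLU test split):

```python
import math

# Reported values for harness|hendrycksTest-high_school_computer_science|5
acc = 0.32
reported_stderr = 0.04688261722621504

# Sample standard error of a proportion over n questions
# (n = 100 is an assumption based on the public MMLU test split).
n = 100
stderr = math.sqrt(acc * (1 - acc) / (n - 1))

# stderr agrees with reported_stderr to floating-point precision
print(stderr)
```

The same check can be applied to any subtask whose question count is known, which is a convenient way to verify that a results blob was not corrupted in transit.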
SiguienteGlobal/linguistica_assist
--- license: apache-2.0 language: - es tags: - code pretty_name: linguistica_assist size_categories: - 10K<n<100K --- # Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. 
## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
Nexdata/Interspeech2020_Accented_English_Speech_Recognition_Competition_Data
--- YAML tags: - copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging --- # Dataset Card for Nexdata/Interspeech2020_Accented_English_Speech_Recognition_Competition_Data ## Table of Contents - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** https://www.nexdata.ai/datasets/1169?source=Huggingface - **Repository:** - **Paper:** - **Leaderboard:** - **Point of Contact:** ### Dataset Summary Interspeech 2020 Accented English Speech Recognition Competition Data. The text has been proofread manually with high accuracy; this dataset can be used for automatic speech recognition, machine translation, and voiceprint recognition. For more details, please refer to the link: https://www.nexdata.ai/datasets/1169?source=Huggingface ### Supported Tasks and Leaderboards automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages Accented English ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing ### Citation Information [More Information Needed] ### Contributions
MoritzLaurer/dataset_test_concat_nli
--- dataset_info: features: - name: text dtype: string - name: hypothesis dtype: string - name: labels dtype: class_label: names: '0': entailment '1': not_entailment - name: task_name dtype: string splits: - name: train num_bytes: 15114416 num_examples: 59140 download_size: 8715544 dataset_size: 15114416 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "dataset_test_concat_nli" Dataset for testing a universal classifier. Additional information and training code available here: https://github.com/MoritzLaurer/zeroshot-classifier
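The card describes a binary-NLI schema (`text`, `hypothesis`, `labels` with 0 = entailment / 1 = not_entailment, `task_name`) for testing a universal classifier. A minimal sketch of how an arbitrary classification task can be cast into this format — the helper and hypothesis template below are illustrative assumptions, not part of the linked training code:

```python
# Label ids follow the card's class_label definition:
# 0 = entailment, 1 = not_entailment.
ENTAILMENT, NOT_ENTAILMENT = 0, 1

def to_nli_rows(text, candidate_labels, true_label, task_name):
    """Turn one classification example into binary-NLI rows,
    one row per candidate label."""
    rows = []
    for label in candidate_labels:
        rows.append({
            "text": text,
            "hypothesis": f"This example is about {label}.",
            "labels": ENTAILMENT if label == true_label else NOT_ENTAILMENT,
            "task_name": task_name,
        })
    return rows

rows = to_nli_rows("The match ended 2-0.", ["sports", "politics"], "sports", "topic")
```

At inference time the same trick runs in reverse: score each candidate hypothesis against the text and pick the label whose hypothesis gets the highest entailment probability.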
tohoku-nlp/multi-vidsum-eval
--- license: apache-2.0 ---
bigscience-data/roots_id_ted_talks_iwslt
--- language: id license: cc-by-nc-nd-4.0 extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience Ethical Charter. The charter can be found at: https://hf.co/spaces/bigscience/ethical-charter' extra_gated_fields: I have read and agree to abide by the BigScience Ethical Charter: checkbox --- ROOTS Subset: roots_id_ted_talks_iwslt # WIT Ted Talks - Dataset uid: `ted_talks_iwslt` ### Description The Web Inventory Talk is a collection of the original TED talks and their translated versions. The translations are available in 109+ languages, though the distribution is not uniform. ### Homepage https://github.com/huggingface/datasets/blob/master/datasets/ted_talks_iwslt/README.md ### Licensing - open license - cc-by-nc-nd-4.0: Creative Commons Attribution Non Commercial No Derivatives 4.0 International TED makes its collection of video recordings and transcripts of talks available under the Creative Commons BY-NC-ND license. WIT3 acknowledges the authorship of TED talks (BY condition) and does not redistribute transcripts for commercial purposes (NC). As regards the integrity of the work (ND), WIT3 only changes the format of the container, while preserving the original contents. WIT3 aims to support research on human language processing as well as the diffusion of TED Talks!
### Speaker Locations - Southern Europe - Italy ### Sizes - 0.0305 % of total - 0.0736 % of ar - 0.2002 % of pt - 0.0128 % of zh - 0.2236 % of vi - 0.0330 % of fr - 0.0545 % of es - 0.0122 % of en - 0.3704 % of id - 0.0373 % of indic-hi - 0.0330 % of indic-ta - 0.1393 % of indic-mr - 0.0305 % of ca - 0.1179 % of indic-ur - 0.0147 % of indic-bn - 0.0240 % of indic-ml - 0.0244 % of indic-te - 0.0503 % of indic-gu - 0.0211 % of indic-kn - 0.0274 % of eu - 0.0023 % of indic-as - 0.0001 % of indic-pa ### BigScience processing steps #### Filters applied to: ar - dedup_document - dedup_template_soft - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: pt - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: zh - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_1024 #### Filters applied to: vi - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: fr - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_1024 #### Filters applied to: es - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_1024 #### Filters applied to: en - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_1024 #### Filters applied to: id - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: indic-hi - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: indic-ta - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: indic-mr - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: ca - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_1024 #### Filters applied to: indic-ur - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: indic-bn - dedup_document - filter_remove_empty_docs - 
filter_small_docs_bytes_300 #### Filters applied to: indic-ml - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: indic-te - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: indic-gu - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: indic-kn - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: eu - dedup_document - filter_remove_empty_docs #### Filters applied to: indic-as - dedup_document - filter_remove_empty_docs #### Filters applied to: indic-pa - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300
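The `filter_remove_empty_docs` and `filter_small_docs_bytes_*` steps listed above are document-level filters. A minimal sketch of what such filters might look like — this is an assumption for illustration; the actual implementations live in the BigScience data-preparation code:

```python
def filter_remove_empty_docs(doc: str) -> bool:
    """Keep a document only if it contains non-whitespace text."""
    return bool(doc.strip())

def filter_small_docs_bytes(doc: str, min_bytes: int = 300) -> bool:
    """Keep a document only if its UTF-8 encoding is at least min_bytes long."""
    return len(doc.encode("utf-8")) >= min_bytes

docs = ["", "   ", "too short", "x" * 1024]
# Only the 1024-byte document survives both filters.
kept = [d for d in docs if filter_remove_empty_docs(d) and filter_small_docs_bytes(d, 300)]
```

Note that the threshold is measured in bytes, not characters, which matters for the non-Latin scripts (e.g. the indic-* subsets) where one character can encode to several UTF-8 bytes.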
open-llm-leaderboard/details_juhwanlee__experiment2-cause-v1
--- pretty_name: Evaluation run of juhwanlee/experiment2-cause-v1 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [juhwanlee/experiment2-cause-v1](https://huggingface.co/juhwanlee/experiment2-cause-v1)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_juhwanlee__experiment2-cause-v1\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-05T05:00:39.006892](https://huggingface.co/datasets/open-llm-leaderboard/details_juhwanlee__experiment2-cause-v1/blob/main/results_2024-03-05T05-00-39.006892.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6347236795836808,\n\ \ \"acc_stderr\": 0.03239027557700436,\n \"acc_norm\": 0.6403177585491361,\n\ \ \"acc_norm_stderr\": 0.033044435643731676,\n \"mc1\": 0.32313341493268055,\n\ \ \"mc1_stderr\": 0.016371836286454604,\n \"mc2\": 0.4719694152855096,\n\ \ \"mc2_stderr\": 0.014750153145318967\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5716723549488054,\n \"acc_stderr\": 0.014460496367599012,\n\ \ \"acc_norm\": 0.6100682593856656,\n \"acc_norm_stderr\": 0.014252959848892893\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6318462457677754,\n\ \ \"acc_stderr\": 0.004813177057496268,\n \"acc_norm\": 0.8337980481975702,\n\ \ \"acc_norm_stderr\": 0.003715010224478618\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\ \ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\ \ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n\ \ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\ \ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \ \ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\ \ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\ \ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\ \ \"acc_norm_stderr\": 0.03827052357950756\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \ \ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\ \ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\ \ \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n\ \ \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\ \ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\ \ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\ \ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\ \ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n\ \ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\ \ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137602,\n \"\ acc_norm\": 0.38095238095238093,\n 
\"acc_norm_stderr\": 0.025010749116137602\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\ \ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\ \ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n\ \ \"acc_stderr\": 0.02436259969303109,\n \"acc_norm\": 0.7580645161290323,\n\ \ \"acc_norm_stderr\": 0.02436259969303109\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\ \ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\ : 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\ \ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\ acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\ \ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\ \ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \ \ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977924,\n\ \ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977924\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\ acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431388,\n \"\ acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431388\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\ acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"\ acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \ \ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\ \ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\ \ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\ \ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\ acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\ \ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\ \ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\ \ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\ \ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\ \ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\ \ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\ \ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\ \ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \ \ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\ \ \"acc_stderr\": 0.013778693778464074,\n \"acc_norm\": 0.8186462324393359,\n\ \ \"acc_norm_stderr\": 0.013778693778464074\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n\ \ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3787709497206704,\n\ \ \"acc_stderr\": 0.01622353351036511,\n \"acc_norm\": 
0.3787709497206704,\n\ \ \"acc_norm_stderr\": 0.01622353351036511\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\ \ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\ \ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\ \ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n\ \ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \ \ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n\ \ \"acc_stderr\": 0.01268781841959992,\n \"acc_norm\": 0.44328552803129073,\n\ \ \"acc_norm_stderr\": 0.01268781841959992\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988633,\n\ \ \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988633\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6568627450980392,\n \"acc_stderr\": 0.019206606848825362,\n \ \ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.019206606848825362\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\ \ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\ \ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304335,\n\ \ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304335\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\ \ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\ \ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \ \ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\ \ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\ \ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n\ \ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32313341493268055,\n\ \ \"mc1_stderr\": 0.016371836286454604,\n \"mc2\": 0.4719694152855096,\n\ \ \"mc2_stderr\": 0.014750153145318967\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7900552486187845,\n \"acc_stderr\": 0.01144628062926263\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.38968915845337376,\n \ \ \"acc_stderr\": 0.013433123236110702\n }\n}\n```" repo_url: https://huggingface.co/juhwanlee/experiment2-cause-v1 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|arc:challenge|25_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-05T05-00-39.006892.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|gsm8k|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_05T05_00_39.006892 path: - 
'**/details_harness|hellaswag|10_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T05-00-39.006892.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-05T05-00-39.006892.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T05-00-39.006892.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T05-00-39.006892.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T05-00-39.006892.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-05T05-00-39.006892.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T05-00-39.006892.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-management|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T05-00-39.006892.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|truthfulqa:mc|0_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-05T05-00-39.006892.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_05T05_00_39.006892 path: - '**/details_harness|winogrande|5_2024-03-05T05-00-39.006892.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-05T05-00-39.006892.parquet' - config_name: results data_files: - split: 
2024_03_05T05_00_39.006892 path: - results_2024-03-05T05-00-39.006892.parquet - split: latest path: - results_2024-03-05T05-00-39.006892.parquet --- # Dataset Card for Evaluation run of juhwanlee/experiment2-cause-v1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [juhwanlee/experiment2-cause-v1](https://huggingface.co/juhwanlee/experiment2-cause-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_juhwanlee__experiment2-cause-v1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-03-05T05:00:39.006892](https://huggingface.co/datasets/open-llm-leaderboard/details_juhwanlee__experiment2-cause-v1/blob/main/results_2024-03-05T05-00-39.006892.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You can find each in the results and the "latest" split for each eval):

```python
{
    "all": { "acc": 0.6347236795836808, "acc_stderr": 0.03239027557700436, "acc_norm": 0.6403177585491361, "acc_norm_stderr": 0.033044435643731676, "mc1": 0.32313341493268055, "mc1_stderr": 0.016371836286454604, "mc2": 0.4719694152855096, "mc2_stderr": 0.014750153145318967 },
    "harness|arc:challenge|25": { "acc": 0.5716723549488054, "acc_stderr": 0.014460496367599012, "acc_norm": 0.6100682593856656, "acc_norm_stderr": 0.014252959848892893 },
    "harness|hellaswag|10": { "acc": 0.6318462457677754, "acc_stderr": 0.004813177057496268, "acc_norm": 0.8337980481975702, "acc_norm_stderr": 0.003715010224478618 },
    "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 },
    "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.0421850621536888, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.0421850621536888 },
    "harness|hendrycksTest-astronomy|5": { "acc": 0.6513157894736842, "acc_stderr": 0.0387813988879761, "acc_norm": 0.6513157894736842, "acc_norm_stderr": 0.0387813988879761 },
    "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 },
    "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569525, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569525 },
    "harness|hendrycksTest-college_biology|5": { "acc": 0.7013888888888888, "acc_stderr": 0.03827052357950756, "acc_norm": 0.7013888888888888, "acc_norm_stderr": 0.03827052357950756 },
    "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 },
    "harness|hendrycksTest-college_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 },
    "harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145633 },
    "harness|hendrycksTest-college_medicine|5": { "acc": 0.6358381502890174, "acc_stderr": 0.03669072477416906, "acc_norm": 0.6358381502890174, "acc_norm_stderr": 0.03669072477416906 },
    "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 },
    "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 },
    "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 },
    "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.047028804320496165, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.047028804320496165 },
    "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 },
    "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.38095238095238093, "acc_stderr": 0.025010749116137602, "acc_norm": 0.38095238095238093, "acc_norm_stderr": 0.025010749116137602 },
    "harness|hendrycksTest-formal_logic|5": { "acc": 0.3968253968253968, "acc_stderr": 0.043758884927270605, "acc_norm": 0.3968253968253968, "acc_norm_stderr": 0.043758884927270605 },
    "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145633 },
    "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7580645161290323, "acc_stderr": 0.02436259969303109, "acc_norm": 0.7580645161290323, "acc_norm_stderr": 0.02436259969303109 },
    "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 },
    "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 },
    "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7454545454545455, "acc_stderr": 0.03401506715249039, "acc_norm": 0.7454545454545455, "acc_norm_stderr": 0.03401506715249039 },
    "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586815, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586815 },
    "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8704663212435233, "acc_stderr": 0.024233532297758733, "acc_norm": 0.8704663212435233, "acc_norm_stderr": 0.024233532297758733 },
    "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6487179487179487, "acc_stderr": 0.024203665177902803, "acc_norm": 0.6487179487179487, "acc_norm_stderr": 0.024203665177902803 },
    "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35555555555555557, "acc_stderr": 0.02918571494985741, "acc_norm": 0.35555555555555557, "acc_norm_stderr": 0.02918571494985741 },
    "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.030066761582977924, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.030066761582977924 },
    "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.038796870240733264, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 },
    "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8238532110091743, "acc_stderr": 0.016332882393431388, "acc_norm": 0.8238532110091743, "acc_norm_stderr": 0.016332882393431388 },
    "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5138888888888888, "acc_stderr": 0.03408655867977749, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.03408655867977749 },
    "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7843137254901961, "acc_stderr": 0.028867431449849316, "acc_norm": 0.7843137254901961, "acc_norm_stderr": 0.028867431449849316 },
    "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.0263616516683891, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.0263616516683891 },
    "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 },
    "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 },
    "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 },
    "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8148148148148148, "acc_stderr": 0.03755265865037181, "acc_norm": 0.8148148148148148, "acc_norm_stderr": 0.03755265865037181 },
    "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7852760736196319, "acc_stderr": 0.03226219377286775, "acc_norm": 0.7852760736196319, "acc_norm_stderr": 0.03226219377286775 },
    "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 },
    "harness|hendrycksTest-management|5": { "acc": 0.8058252427184466, "acc_stderr": 0.03916667762822584, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.03916667762822584 },
    "harness|hendrycksTest-marketing|5": { "acc": 0.8888888888888888, "acc_stderr": 0.020588491316092368, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.020588491316092368 },
    "harness|hendrycksTest-medical_genetics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 },
    "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8186462324393359, "acc_stderr": 0.013778693778464074, "acc_norm": 0.8186462324393359, "acc_norm_stderr": 0.013778693778464074 },
    "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7167630057803468, "acc_stderr": 0.024257901705323378, "acc_norm": 0.7167630057803468, "acc_norm_stderr": 0.024257901705323378 },
    "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3787709497206704, "acc_stderr": 0.01622353351036511, "acc_norm": 0.3787709497206704, "acc_norm_stderr": 0.01622353351036511 },
    "harness|hendrycksTest-nutrition|5": { "acc": 0.7581699346405228, "acc_stderr": 0.024518195641879334, "acc_norm": 0.7581699346405228, "acc_norm_stderr": 0.024518195641879334 },
    "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.02558306248998481, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.02558306248998481 },
    "harness|hendrycksTest-prehistory|5": { "acc": 0.7314814814814815, "acc_stderr": 0.024659685185967284, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.024659685185967284 },
    "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4929078014184397, "acc_stderr": 0.02982449855912901, "acc_norm": 0.4929078014184397, "acc_norm_stderr": 0.02982449855912901 },
    "harness|hendrycksTest-professional_law|5": { "acc": 0.44328552803129073, "acc_stderr": 0.01268781841959992, "acc_norm": 0.44328552803129073, "acc_norm_stderr": 0.01268781841959992 },
    "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6544117647058824, "acc_stderr": 0.028888193103988633, "acc_norm": 0.6544117647058824, "acc_norm_stderr": 0.028888193103988633 },
    "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6568627450980392, "acc_stderr": 0.019206606848825362, "acc_norm": 0.6568627450980392, "acc_norm_stderr": 0.019206606848825362 },
    "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 },
    "harness|hendrycksTest-security_studies|5": { "acc": 0.710204081632653, "acc_stderr": 0.029043088683304335, "acc_norm": 0.710204081632653, "acc_norm_stderr": 0.029043088683304335 },
    "harness|hendrycksTest-sociology|5": { "acc": 0.8606965174129353, "acc_stderr": 0.024484487162913973, "acc_norm": 0.8606965174129353, "acc_norm_stderr": 0.024484487162913973 },
    "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774709, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774709 },
    "harness|hendrycksTest-virology|5": { "acc": 0.5180722891566265, "acc_stderr": 0.03889951252827216, "acc_norm": 0.5180722891566265, "acc_norm_stderr": 0.03889951252827216 },
    "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727668, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727668 },
    "harness|truthfulqa:mc|0": { "mc1": 0.32313341493268055, "mc1_stderr": 0.016371836286454604, "mc2": 0.4719694152855096, "mc2_stderr": 0.014750153145318967 },
    "harness|winogrande|5": { "acc": 0.7900552486187845, "acc_stderr": 0.01144628062926263 },
    "harness|gsm8k|5": { "acc": 0.38968915845337376, "acc_stderr": 0.013433123236110702 }
}
```

## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used.
-->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
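The run-timestamp split names used throughout this card (e.g. `2024_03_05T05_00_39.006892`) appear to be ISO timestamps with `-` and `:` replaced by `_`. As a small sketch of recovering the underlying timestamp (the helper function below is illustrative, not part of the `datasets` API):

```python
from datetime import datetime

def split_name_to_datetime(split_name: str) -> datetime:
    """Recover the run timestamp from a split name like '2024_03_05T05_00_39.006892'."""
    # Undo the character substitutions on either side of the 'T' separator.
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(split_name_to_datetime("2024_03_05T05_00_39.006892"))
# → 2024-03-05 05:00:39.006892
```

This can be handy for sorting the timestamped splits of a configuration chronologically instead of relying on the `latest` alias.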
chenbobo/chat
---
license: unlicense
task_categories:
- text-generation
language:
- zh
tags:
- finance
pretty_name: tyin_demo
size_categories:
- n<1K
---
Pablao0948/Dark_Giovanni
---
license: openrail
---
imoxto/prompt_injection_cleaned_dataset-v2
---
dataset_info:
  features:
  - name: model
    dtype: string
  - name: text
    dtype: string
  - name: labels
    dtype: int64
  splits:
  - name: train
    num_bytes: 670958021
    num_examples: 535105
  download_size: 79246765
  dataset_size: 670958021
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---

# Dataset Card for "prompt_injection_cleaned_dataset-v2"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
HamdanXI/paradetox-1Token-Split-MASK
---
dataset_info:
  features:
  - name: labels
    dtype: string
  - name: input
    dtype: string
  splits:
  - name: train
    num_bytes: 413629
    num_examples: 3784
  - name: validation
    num_bytes: 88206
    num_examples: 811
  - name: test
    num_bytes: 86840
    num_examples: 811
  download_size: 390168
  dataset_size: 588675
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: validation
    path: data/validation-*
  - split: test
    path: data/test-*
---
learn3r/summ_screen_fd_memsum_bp
---
dataset_info:
  features:
  - name: input
    dtype: string
  - name: output
    dtype: string
  splits:
  - name: train
    num_bytes: 7002624
    num_examples: 3673
  - name: validation
    num_bytes: 676928
    num_examples: 338
  - name: test
    num_bytes: 717198
    num_examples: 337
  download_size: 410312
  dataset_size: 8396750
---

# Dataset Card for "summ_screen_fd_memsum_bp"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Edsodre/mari2
---
license: openrail
---
Alexisnlxoekdk/MCkevindataset
---
license: openrail
---
gvlassis/shakespearefirstfolio
---
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 4077942
    num_examples: 30
  - name: validation
    num_bytes: 245785
    num_examples: 2
  - name: test
    num_bytes: 506679
    num_examples: 4
  download_size: 3073023
  dataset_size: 4830406
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: validation
    path: data/validation-*
  - split: test
    path: data/test-*
task_categories:
- text-generation
language:
- en
tags:
- shakespeare
size_categories:
- n<1K
---

# shakespearefirstfolio

## About

🎭 Shakespeare's First Folio (a collection of 36 of Shakespeare's plays) as a Hugging Face dataset!

## Description

In 2015, Andrej Karpathy wrote a post called "The Unreasonable Effectiveness of Recurrent Neural Networks" on his blog. For the needs of this post, he created tinyshakespeare, a subset of Shakespeare's works in a single 40,000-line file. Surprisingly, language models trained from scratch on this tiny dataset can produce samples that look very close to those written by Shakespeare himself. Since then, tinyshakespeare has been the de facto dataset used as a first test while developing language models.

Unfortunately, it has some problems:

1) It is a single file, which makes further processing difficult
2) It does not contain all of Shakespeare's works
3) It is not clear exactly which works, and to what extent, are included

This dataset tries to address these problems. It is ~4 times bigger than tinyshakespeare. It was manually collected from [Folger Shakespeare Library](https://www.folger.edu/).

## Usage

```python
import datasets

dataset = datasets.load_dataset("gvlassis/shakespearefirstfolio")
```
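The `text` field loaded this way is typically fed to a character-level model, in the spirit of the tinyshakespeare experiments mentioned above. A minimal, self-contained encoding sketch (the sample string below is a stand-in for a play's text, not taken from the dataset):

```python
# Build a character vocabulary and a reversible encoding, as is common
# when training small language models on Shakespeare text.
sample = "To be, or not to be, that is the question."

chars = sorted(set(sample))                   # vocabulary of distinct characters
stoi = {ch: i for i, ch in enumerate(chars)}  # char -> integer id
itos = {i: ch for ch, i in stoi.items()}      # integer id -> char

def encode(s: str) -> list[int]:
    return [stoi[c] for c in s]

def decode(ids: list[int]) -> str:
    return "".join(itos[i] for i in ids)

ids = encode(sample)
assert decode(ids) == sample                  # round-trip is lossless
print(f"vocab size: {len(chars)}, sequence length: {len(ids)}")
```

In practice the vocabulary would be built over all 36 plays at once so that train, validation, and test splits share one encoding.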
open-llm-leaderboard/details_migtissera__Synthia-7B
---
pretty_name: Evaluation run of migtissera/Synthia-7B
dataset_summary: "Dataset automatically created during the evaluation run of model [migtissera/Synthia-7B](https://huggingface.co/migtissera/Synthia-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Synthia-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T06:07:54.738296](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-7B/blob/main/results_2023-10-15T06-07-54.738296.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n    \"all\": {\n        \"em\": 0.07151845637583892,\n        \"em_stderr\": 0.00263897548039012,\n        \"f1\": 0.14513737416107345,\n        \"f1_stderr\": 0.0029452435334875074,\n        \"acc\": 0.4043291747772373,\n        \"acc_stderr\": 0.009561470405449964\n    },\n    \"harness|drop|3\": {\n        \"em\": 0.07151845637583892,\n        \"em_stderr\": 0.00263897548039012,\n        \"f1\": 0.14513737416107345,\n        \"f1_stderr\": 0.0029452435334875074\n    },\n    \"harness|gsm8k|5\": {\n        \"acc\": 0.06595905989385899,\n        \"acc_stderr\": 0.006836951192034222\n    },\n    \"harness|winogrande|5\": {\n        \"acc\": 0.7426992896606156,\n        \"acc_stderr\": 0.012285989618865708\n    }\n}\n```"
repo_url: https://huggingface.co/migtissera/Synthia-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
  data_files:
  - split: 2023_08_17T17_21_07.158534
    path:
    - '**/details_harness|arc:challenge|25_2023-08-17T17:21:07.158534.parquet'
  - split: latest
    path:
    - '**/details_harness|arc:challenge|25_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_drop_3
  data_files:
  - split: 2023_10_15T06_07_54.738296
    path:
    - '**/details_harness|drop|3_2023-10-15T06-07-54.738296.parquet'
  - split: latest
    path:
    - '**/details_harness|drop|3_2023-10-15T06-07-54.738296.parquet'
- config_name: harness_gsm8k_5
  data_files:
  - split: 2023_10_15T06_07_54.738296
    path:
    - '**/details_harness|gsm8k|5_2023-10-15T06-07-54.738296.parquet'
  - split: latest
    path:
    - '**/details_harness|gsm8k|5_2023-10-15T06-07-54.738296.parquet'
- config_name: harness_hellaswag_10
  data_files:
  - split: 2023_08_17T17_21_07.158534
    path:
    - '**/details_harness|hellaswag|10_2023-08-17T17:21:07.158534.parquet'
  - split: latest
    path:
    - '**/details_harness|hellaswag|10_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_5
  data_files:
  - split: 2023_08_17T17_21_07.158534
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-management|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T17:21:07.158534.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-management|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2023-08-17T17:21:07.158534.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
  data_files:
  - split: 2023_08_17T17_21_07.158534
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T17:21:07.158534.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_anatomy_5
  data_files:
  - split: 2023_08_17T17_21_07.158534
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T17:21:07.158534.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_astronomy_5
  data_files:
  - split: 2023_08_17T17_21_07.158534
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T17:21:07.158534.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
  data_files:
  - split: 2023_08_17T17_21_07.158534
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T17:21:07.158534.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
  data_files:
  -
split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T17:21:07.158534.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-management|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-virology|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T17:21:07.158534.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_08_17T17_21_07.158534 path: - '**/details_harness|truthfulqa:mc|0_2023-08-17T17:21:07.158534.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-08-17T17:21:07.158534.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_15T06_07_54.738296 path: - '**/details_harness|winogrande|5_2023-10-15T06-07-54.738296.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-15T06-07-54.738296.parquet' - config_name: results data_files: - split: 2023_08_17T17_21_07.158534 path: - results_2023-08-17T17:21:07.158534.parquet - split: 2023_10_15T06_07_54.738296 path: - results_2023-10-15T06-07-54.738296.parquet - split: latest path: - results_2023-10-15T06-07-54.738296.parquet --- # Dataset Card for Evaluation run of migtissera/Synthia-7B ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/migtissera/Synthia-7B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [migtissera/Synthia-7B](https://huggingface.co/migtissera/Synthia-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-15T06:07:54.738296](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-7B/blob/main/results_2023-10-15T06-07-54.738296.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.07151845637583892, "em_stderr": 0.00263897548039012, "f1": 0.14513737416107345, "f1_stderr": 0.0029452435334875074, "acc": 0.4043291747772373, "acc_stderr": 0.009561470405449964 }, "harness|drop|3": { "em": 0.07151845637583892, "em_stderr": 0.00263897548039012, "f1": 0.14513737416107345, "f1_stderr": 0.0029452435334875074 }, "harness|gsm8k|5": { "acc": 0.06595905989385899, "acc_stderr": 0.006836951192034222 }, "harness|winogrande|5": { "acc": 0.7426992896606156, "acc_stderr": 0.012285989618865708 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
yuvalkirstain/pexel_images
--- dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 27590932.0 num_examples: 80 download_size: 27589857 dataset_size: 27590932.0 --- # Dataset Card for "pexel_images" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Afjalru/loan-prediction
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 1654448 num_examples: 1000 download_size: 966693 dataset_size: 1654448 configs: - config_name: default data_files: - split: train path: data/train-* --- # Guanaco-1k: Lazy Llama 2 Formatting This is a subset (1000 samples) of the excellent [`timdettmers/openassistant-guanaco`](https://huggingface.co/datasets/timdettmers/openassistant-guanaco) dataset, processed to match Llama 2's prompt format as described [in this article](https://huggingface.co/blog/llama2#how-to-prompt-llama-2). It was created using the following [colab notebook](https://colab.research.google.com/drive/1Ad7a9zMmkxuXTOh1Z7-rNSICA4dybpM2?usp=sharing). Useful if you don't want to reformat it by yourself (e.g., using a script). It was designed for [this article](https://mlabonne.github.io/blog/posts/Fine_Tune_Your_Own_Llama_2_Model_in_a_Colab_Notebook.html) about fine-tuning a Llama 2 (chat) model in a Google Colab.
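For reference, the reformatting described above can be sketched as a small helper. This is only an illustration of the Llama 2 chat template from the linked article, not the exact code used by the notebook (whitespace details may differ):

```python
def to_llama2_prompt(instruction: str, response: str) -> str:
    # Llama 2 chat format: the user turn goes between [INST] tags,
    # followed by the assistant's reply, wrapped in <s> ... </s>.
    return f"<s>[INST] {instruction} [/INST] {response} </s>"

# one formatted sample, shaped like the dataset's single `text` column
sample = to_llama2_prompt("What is gravity?", "The attraction between masses.")
print(sample)
```

Each row of the `text` column holds one such pre-formatted string, so no further prompt templating is needed at fine-tuning time.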
pradeep239/phil_Image_250Pdfs
--- license: mit dataset_info: features: - name: image dtype: image - name: ground_truth dtype: string splits: - name: train num_bytes: 444300577.0 num_examples: 867 - name: validation num_bytes: 52961081.0 num_examples: 102 - name: test num_bytes: 25115003.0 num_examples: 51 download_size: 434069431 dataset_size: 522376661.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* ---
jasonzsxie/my_dataset
--- dataset_info: features: - name: audio dtype: audio splits: - name: train num_bytes: 64049.0 num_examples: 1 download_size: 65151 dataset_size: 64049.0 configs: - config_name: default data_files: - split: train path: data/train-* ---
pgwi/clean_fashion_data
--- license: apache-2.0 ---
language-plus-molecules/LPM-24_train-extra
--- dataset_info: features: - name: molecule dtype: string - name: caption dtype: string splits: - name: train num_bytes: 276260470 num_examples: 802800 - name: split_train num_bytes: 219611475 num_examples: 634320 - name: split_valid num_bytes: 56648995 num_examples: 168480 download_size: 78056020 dataset_size: 552520940 configs: - config_name: default data_files: - split: train path: data/train-* - split: split_train path: data/split_train-* - split: split_valid path: data/split_valid-* ---
correll/semanticsegmentationandposeestimationfromrgbd
--- license: mit task_categories: - image-segmentation - image-classification - object-detection pretty_name: Semantic segmentation and pose estimation from RGB-D dataset_info: features: - name: rgb dtype: image - name: depth dtype: image - name: mask dtype: image - name: meta list: - name: colors sequence: float64 - name: file dtype: string - name: id dtype: int64 - name: model dtype: string - name: numberOfColors dtype: int64 - name: orientation sequence: float64 - name: position sequence: float64 - name: positionOnImage sequence: int64 - name: size sequence: float64 - name: sizeOnImage sequence: int64 splits: - name: train num_bytes: 3340733260.96 num_examples: 1106 download_size: 3319212411 dataset_size: 3340733260.96 configs: - config_name: default data_files: - split: train path: data/train-* --- RGB-D dataset for instance segmentation (from RGB or depth) and pose estimation of individual objects. The data has been generated by randomizing bin contents in Webots. Each instance contains a mask image as well as metadata containing labels, position, and size of each object. <video src='https://cdn-uploads.huggingface.co/production/uploads/655b1b359d249b4ab388d4a2/l6b76ezxkPi6lG3Fr6_kj.mp4' width=720/> You can create your own data by opening webots_grasp.wbt in the world directory using [Webots](https://www.cyberbotics.com).
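The per-object `meta` entries can be turned into pixel-space bounding boxes for standard detection tooling. A minimal sketch, assuming (hypothetically) that `positionOnImage` is the object's center pixel and `sizeOnImage` its width and height; verify these conventions against your own renders before relying on them:

```python
def bbox_from_meta(obj: dict) -> tuple:
    # Derive an (x_min, y_min, x_max, y_max) box from one meta entry.
    # Assumes positionOnImage = (cx, cy) center pixel and
    # sizeOnImage = (w, h) in pixels -- check against the actual data.
    cx, cy = obj["positionOnImage"]
    w, h = obj["sizeOnImage"]
    return (cx - w // 2, cy - h // 2, cx + w // 2, cy + h // 2)

# synthetic meta entry following the schema listed above
obj = {"positionOnImage": [320, 240], "sizeOnImage": [64, 32]}
print(bbox_from_meta(obj))  # (288, 224, 352, 256)
```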
Gabriel1898/poze1
--- license: openrail ---
huggingartists/aikko
--- language: - en tags: - huggingartists - lyrics --- # Dataset Card for "huggingartists/aikko" ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [How to use](#how-to-use) - [Dataset Structure](#dataset-structure) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [About](#about) ## Dataset Description - **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists) - **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists) - **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) - **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) - **Size of the generated dataset:** 1.029888 MB <div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: 
url(&#39;https://images.genius.com/a1a40316d1405fa83df2a21923d64168.1000x1000x1.jpg&#39;)"> </div> </div> <a href="https://huggingface.co/huggingartists/aikko"> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div> </a> <div style="text-align: center; font-size: 16px; font-weight: 800">⁣aikko</div> <a href="https://genius.com/artists/aikko"> <div style="text-align: center; font-size: 14px;">@aikko</div> </a> </div> ### Dataset Summary The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists. Model is available [here](https://huggingface.co/huggingartists/aikko). ### Supported Tasks and Leaderboards [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Languages en ## How to use How to load this dataset directly with the datasets library: ```python from datasets import load_dataset dataset = load_dataset("huggingartists/aikko") ``` ## Dataset Structure An example of 'train' looks as follows. ``` This example was too long and was cropped: { "text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..." } ``` ### Data Fields The data fields are the same among all splits. - `text`: a `string` feature. 
### Data Splits | train |validation|test| |------:|---------:|---:| |305| -| -| 'Train' can be easily divided into 'train' & 'validation' & 'test' with a few lines of code: ```python from datasets import load_dataset, Dataset, DatasetDict import numpy as np datasets = load_dataset("huggingartists/aikko") train_percentage = 0.9 validation_percentage = 0.07 test_percentage = 0.03 train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))]) datasets = DatasetDict( { 'train': Dataset.from_dict({'text': list(train)}), 'validation': Dataset.from_dict({'text': list(validation)}), 'test': Dataset.from_dict({'text': list(test)}) } ) ``` ## Dataset Creation ### Curation Rationale [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Source Data #### Initial Data Collection and Normalization [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) #### Who are the source language producers? 
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Personal and Sensitive Information

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Discussion of Biases

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Other Known Limitations

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

## Additional Information

### Dataset Curators

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Licensing Information

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Citation Information

```
@InProceedings{huggingartists,
  author={Aleksey Korshuk},
  year={2021}
}
```

## About

*Built by Aleksey Korshuk*

[![Follow](https://img.shields.io/github/followers/AlekseyKorshuk?style=social)](https://github.com/AlekseyKorshuk)

[![Follow](https://img.shields.io/twitter/follow/alekseykorshuk?style=social)](https://twitter.com/intent/follow?screen_name=alekseykorshuk)

[![Follow](https://img.shields.io/badge/dynamic/json?color=blue&label=Telegram%20Channel&query=%24.result&url=https%3A%2F%2Fapi.telegram.org%2Fbot1929545866%3AAAFGhV-KKnegEcLiyYJxsc4zV6C-bdPEBtQ%2FgetChatMemberCount%3Fchat_id%3D-1001253621662&style=social&logo=telegram)](https://t.me/joinchat/_CQ04KjcJ-4yZTky)

For more details, visit the project repository.
[![GitHub stars](https://img.shields.io/github/stars/AlekseyKorshuk/huggingartists?style=social)](https://github.com/AlekseyKorshuk/huggingartists)
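The percentage-based split shown in the card above can be sanity-checked offline on a toy list: the second argument to `np.split` is a list of cumulative cut indices, so for 100 items the boundaries land at 90 and 97, giving 90/7/3 slices. This sketch uses hypothetical stand-in strings instead of the real hub dataset, so it needs no network access:

```python
import numpy as np

# Stand-in for datasets['train']['text']: 100 hypothetical lyric strings.
texts = [f"lyric {i}" for i in range(100)]

train_pct, validation_pct = 0.9, 0.07  # the test split takes the remaining 0.03

# np.split cuts at the two cumulative boundaries: index 90 and index 97.
train, validation, test = np.split(
    np.array(texts),
    [int(len(texts) * train_pct),
     int(len(texts) * (train_pct + validation_pct))],
)

sizes = (len(train), len(validation), len(test))  # (90, 7, 3)
```

Because the boundaries are cumulative, the three slices always partition the original list exactly, with no items dropped or duplicated.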
ppxscal/citation-network-v1-jaccard
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: node1 dtype: int64 - name: abstract1 dtype: string - name: node2 dtype: int64 - name: abstract2 dtype: string - name: jaccard_score dtype: float64 splits: - name: train num_bytes: 928245561 num_examples: 631592 download_size: 300134106 dataset_size: 928245561 --- # Dataset Card for "embeddings-network-jaccard" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
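The card does not document how `jaccard_score` was computed for an abstract pair. One plausible reading (an assumption, not confirmed by the dataset authors) is token-set Jaccard similarity between the two abstracts, i.e. the size of the shared vocabulary divided by the size of the combined vocabulary:

```python
def jaccard(a: str, b: str) -> float:
    """Hypothetical definition: Jaccard similarity of the word sets of two abstracts."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    if not (sa or sb):
        return 0.0  # two empty abstracts: define similarity as 0
    return len(sa & sb) / len(sa | sb)

# Shared tokens: {neural, networks, citation, prediction} -> 4; union has 7 tokens.
score = jaccard(
    "graph neural networks for citation prediction",
    "citation prediction with neural networks",
)
```

The actual dataset may well have used a different tokenization (e.g. stemming, stop-word removal, or character shingles), so this function should be read as an illustration of the metric, not a reconstruction of the pipeline.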
seongwoon/industry-occupation
--- license: cc-by-nc-nd-4.0 ---
open-llm-leaderboard/details_Severian__ANIMA-Nectar-v3
--- pretty_name: Evaluation run of Severian/ANIMA-Nectar-v3 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Severian/ANIMA-Nectar-v3](https://huggingface.co/Severian/ANIMA-Nectar-v3) on\ \ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Severian__ANIMA-Nectar-v3\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-12-09T16:02:02.105784](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Nectar-v3/blob/main/results_2023-12-09T16-02-02.105784.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5279231991868109,\n\ \ \"acc_stderr\": 0.034095739528534785,\n \"acc_norm\": 0.5365465936132023,\n\ \ \"acc_norm_stderr\": 0.0349316183151297,\n \"mc1\": 0.3108935128518972,\n\ \ \"mc1_stderr\": 0.016203316673559693,\n \"mc2\": 0.4616473915095851,\n\ \ \"mc2_stderr\": 0.014431098139511664\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.454778156996587,\n \"acc_stderr\": 0.014551507060836353,\n\ \ \"acc_norm\": 0.4948805460750853,\n \"acc_norm_stderr\": 0.014610624890309154\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5621390161322446,\n\ \ \"acc_stderr\": 0.004951097802775953,\n \"acc_norm\": 0.7599083847839075,\n\ \ \"acc_norm_stderr\": 0.004262659388824526\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\ \ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\ \ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\ \ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n\ \ \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\ \ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \ \ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n\ \ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n\ \ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.5416666666666666,\n\ \ \"acc_norm_stderr\": 0.04166666666666665\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n\ \ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\ \ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\ \ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207763,\n\ \ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207763\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\ \ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\ \ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\ \ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\ \ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\ \ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"\ acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 
0.025043757318520196\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\ \ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\ \ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\ \ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6258064516129033,\n\ \ \"acc_stderr\": 0.027528904299845704,\n \"acc_norm\": 0.6258064516129033,\n\ \ \"acc_norm_stderr\": 0.027528904299845704\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486517,\n\ \ \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486517\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\ : 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\ \ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.6767676767676768,\n \"acc_stderr\": 0.033322999210706444,\n \"\ acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.033322999210706444\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7098445595854922,\n \"acc_stderr\": 0.032752644677915166,\n\ \ \"acc_norm\": 0.7098445595854922,\n \"acc_norm_stderr\": 0.032752644677915166\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.4641025641025641,\n \"acc_stderr\": 0.02528558599001784,\n \ \ \"acc_norm\": 0.4641025641025641,\n \"acc_norm_stderr\": 0.02528558599001784\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \ \ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.46218487394957986,\n \"acc_stderr\": 0.032385469487589795,\n\ \ \"acc_norm\": 0.46218487394957986,\n \"acc_norm_stderr\": 0.032385469487589795\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\ acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7045871559633028,\n \"acc_stderr\": 0.019560619182976,\n \"acc_norm\"\ : 0.7045871559633028,\n \"acc_norm_stderr\": 0.019560619182976\n },\n\ \ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3472222222222222,\n\ \ \"acc_stderr\": 0.032468872436376486,\n \"acc_norm\": 0.3472222222222222,\n\ \ \"acc_norm_stderr\": 0.032468872436376486\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\ : {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.034107853389047205,\n\ \ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.034107853389047205\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.6624472573839663,\n \"acc_stderr\": 0.030781549102026223,\n \ \ \"acc_norm\": 0.6624472573839663,\n \"acc_norm_stderr\": 0.030781549102026223\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\ \ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\ \ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n\ \ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.6776859504132231,\n \"acc_stderr\": 0.042664163633521685,\n \"\ acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.042664163633521685\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\ \ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\ \ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\ \ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\ \ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\ \ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039476,\n\ \ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039476\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\ \ \"acc_stderr\": 0.025598193686652265,\n \"acc_norm\": 0.811965811965812,\n\ \ \"acc_norm_stderr\": 0.025598193686652265\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \ \ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7113665389527458,\n\ \ \"acc_stderr\": 0.016203792703197786,\n \"acc_norm\": 0.7113665389527458,\n\ \ \"acc_norm_stderr\": 0.016203792703197786\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.5346820809248555,\n \"acc_stderr\": 0.026854257928258893,\n\ \ \"acc_norm\": 0.5346820809248555,\n \"acc_norm_stderr\": 0.026854257928258893\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33631284916201115,\n\ \ \"acc_stderr\": 0.015801003729145894,\n 
\"acc_norm\": 0.33631284916201115,\n\ \ \"acc_norm_stderr\": 0.015801003729145894\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5228758169934641,\n \"acc_stderr\": 0.028599936776089782,\n\ \ \"acc_norm\": 0.5228758169934641,\n \"acc_norm_stderr\": 0.028599936776089782\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\ \ \"acc_stderr\": 0.027648149599751468,\n \"acc_norm\": 0.6141479099678456,\n\ \ \"acc_norm_stderr\": 0.027648149599751468\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.02691500301138016,\n\ \ \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.02691500301138016\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199495,\n \ \ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199495\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3578878748370274,\n\ \ \"acc_stderr\": 0.012243563850490313,\n \"acc_norm\": 0.3578878748370274,\n\ \ \"acc_norm_stderr\": 0.012243563850490313\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.03016191193076711,\n\ \ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.03016191193076711\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.49836601307189543,\n \"acc_stderr\": 0.020227726838150117,\n \ \ \"acc_norm\": 0.49836601307189543,\n \"acc_norm_stderr\": 0.020227726838150117\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\ \ \"acc_stderr\": 0.04494290866252089,\n \"acc_norm\": 0.6727272727272727,\n\ \ \"acc_norm_stderr\": 0.04494290866252089\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.031251275910891656,\n\ \ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.031251275910891656\n\ 
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n\ \ \"acc_stderr\": 0.03307615947979033,\n \"acc_norm\": 0.6766169154228856,\n\ \ \"acc_norm_stderr\": 0.03307615947979033\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \ \ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\ \ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\ \ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n\ \ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3108935128518972,\n\ \ \"mc1_stderr\": 0.016203316673559693,\n \"mc2\": 0.4616473915095851,\n\ \ \"mc2_stderr\": 0.014431098139511664\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7371744277821626,\n \"acc_stderr\": 0.01237092252726201\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.047763457164518575,\n \ \ \"acc_stderr\": 0.00587438753622932\n }\n}\n```" repo_url: https://huggingface.co/Severian/ANIMA-Nectar-v3 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|arc:challenge|25_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-12-09T16-02-02.105784.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|gsm8k|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_12_09T16_02_02.105784 path: - 
'**/details_harness|hellaswag|10_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-02-02.105784.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-02-02.105784.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-02-02.105784.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-02-02.105784.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-02-02.105784.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-09T16-02-02.105784.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-02-02.105784.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-management|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-virology|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-02-02.105784.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|truthfulqa:mc|0_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-12-09T16-02-02.105784.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_12_09T16_02_02.105784 path: - '**/details_harness|winogrande|5_2023-12-09T16-02-02.105784.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-12-09T16-02-02.105784.parquet' - config_name: results data_files: - split: 
2023_12_09T16_02_02.105784 path: - results_2023-12-09T16-02-02.105784.parquet - split: latest path: - results_2023-12-09T16-02-02.105784.parquet --- # Dataset Card for Evaluation run of Severian/ANIMA-Nectar-v3 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Severian/ANIMA-Nectar-v3 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [Severian/ANIMA-Nectar-v3](https://huggingface.co/Severian/ANIMA-Nectar-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Severian__ANIMA-Nectar-v3", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-09T16:02:02.105784](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Nectar-v3/blob/main/results_2023-12-09T16-02-02.105784.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5279231991868109, "acc_stderr": 0.034095739528534785, "acc_norm": 0.5365465936132023, "acc_norm_stderr": 0.0349316183151297, "mc1": 0.3108935128518972, "mc1_stderr": 0.016203316673559693, "mc2": 0.4616473915095851, "mc2_stderr": 0.014431098139511664 }, "harness|arc:challenge|25": { "acc": 0.454778156996587, "acc_stderr": 0.014551507060836353, "acc_norm": 0.4948805460750853, "acc_norm_stderr": 0.014610624890309154 }, "harness|hellaswag|10": { "acc": 0.5621390161322446, "acc_stderr": 0.004951097802775953, "acc_norm": 0.7599083847839075, "acc_norm_stderr": 0.004262659388824526 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4888888888888889, "acc_stderr": 0.04318275491977976, "acc_norm": 0.4888888888888889, "acc_norm_stderr": 0.04318275491977976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.506578947368421, "acc_stderr": 0.040685900502249704, "acc_norm": 0.506578947368421, "acc_norm_stderr": 0.040685900502249704 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5962264150943396, "acc_stderr": 0.03019761160019795, "acc_norm": 0.5962264150943396, "acc_norm_stderr": 0.03019761160019795 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5416666666666666, "acc_stderr": 0.04166666666666665, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.04166666666666665 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.45, "acc_stderr": 0.04999999999999999, "acc_norm": 0.45, "acc_norm_stderr": 
0.04999999999999999 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5433526011560693, "acc_stderr": 0.03798106566014498, "acc_norm": 0.5433526011560693, "acc_norm_stderr": 0.03798106566014498 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.30392156862745096, "acc_stderr": 0.04576665403207763, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.04576665403207763 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4723404255319149, "acc_stderr": 0.03263597118409769, "acc_norm": 0.4723404255319149, "acc_norm_stderr": 0.03263597118409769 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.37719298245614036, "acc_stderr": 0.04559522141958216, "acc_norm": 0.37719298245614036, "acc_norm_stderr": 0.04559522141958216 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5103448275862069, "acc_stderr": 0.04165774775728763, "acc_norm": 0.5103448275862069, "acc_norm_stderr": 0.04165774775728763 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3835978835978836, "acc_stderr": 0.025043757318520196, "acc_norm": 0.3835978835978836, "acc_norm_stderr": 0.025043757318520196 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3253968253968254, "acc_stderr": 0.041905964388711366, "acc_norm": 0.3253968253968254, "acc_norm_stderr": 0.041905964388711366 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6258064516129033, "acc_stderr": 0.027528904299845704, "acc_norm": 0.6258064516129033, "acc_norm_stderr": 0.027528904299845704 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.43842364532019706, "acc_stderr": 0.03491207857486517, "acc_norm": 0.43842364532019706, "acc_norm_stderr": 0.03491207857486517 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6484848484848484, "acc_stderr": 0.037282069986826503, "acc_norm": 0.6484848484848484, "acc_norm_stderr": 0.037282069986826503 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6767676767676768, "acc_stderr": 0.033322999210706444, "acc_norm": 0.6767676767676768, "acc_norm_stderr": 0.033322999210706444 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7098445595854922, "acc_stderr": 0.032752644677915166, "acc_norm": 0.7098445595854922, "acc_norm_stderr": 0.032752644677915166 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4641025641025641, "acc_stderr": 0.02528558599001784, "acc_norm": 0.4641025641025641, "acc_norm_stderr": 0.02528558599001784 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.28888888888888886, "acc_stderr": 0.027634907264178544, "acc_norm": 0.28888888888888886, "acc_norm_stderr": 0.027634907264178544 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.46218487394957986, "acc_stderr": 0.032385469487589795, "acc_norm": 0.46218487394957986, "acc_norm_stderr": 0.032385469487589795 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, "acc_stderr": 0.0395802723112157, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 0.0395802723112157 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7045871559633028, "acc_stderr": 0.019560619182976, "acc_norm": 0.7045871559633028, "acc_norm_stderr": 0.019560619182976 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3472222222222222, "acc_stderr": 
0.032468872436376486, "acc_norm": 0.3472222222222222, "acc_norm_stderr": 0.032468872436376486 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6176470588235294, "acc_stderr": 0.034107853389047205, "acc_norm": 0.6176470588235294, "acc_norm_stderr": 0.034107853389047205 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6624472573839663, "acc_stderr": 0.030781549102026223, "acc_norm": 0.6624472573839663, "acc_norm_stderr": 0.030781549102026223 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.600896860986547, "acc_stderr": 0.03286745312567961, "acc_norm": 0.600896860986547, "acc_norm_stderr": 0.03286745312567961 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6335877862595419, "acc_stderr": 0.04225875451969638, "acc_norm": 0.6335877862595419, "acc_norm_stderr": 0.04225875451969638 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6776859504132231, "acc_stderr": 0.042664163633521685, "acc_norm": 0.6776859504132231, "acc_norm_stderr": 0.042664163633521685 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6574074074074074, "acc_stderr": 0.045879047413018105, "acc_norm": 0.6574074074074074, "acc_norm_stderr": 0.045879047413018105 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6687116564417178, "acc_stderr": 0.03697983910025588, "acc_norm": 0.6687116564417178, "acc_norm_stderr": 0.03697983910025588 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.6990291262135923, "acc_stderr": 0.045416094465039476, "acc_norm": 0.6990291262135923, "acc_norm_stderr": 0.045416094465039476 }, "harness|hendrycksTest-marketing|5": { "acc": 0.811965811965812, "acc_stderr": 0.025598193686652265, "acc_norm": 0.811965811965812, "acc_norm_stderr": 0.025598193686652265 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.53, "acc_stderr": 
0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7113665389527458, "acc_stderr": 0.016203792703197786, "acc_norm": 0.7113665389527458, "acc_norm_stderr": 0.016203792703197786 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5346820809248555, "acc_stderr": 0.026854257928258893, "acc_norm": 0.5346820809248555, "acc_norm_stderr": 0.026854257928258893 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.33631284916201115, "acc_stderr": 0.015801003729145894, "acc_norm": 0.33631284916201115, "acc_norm_stderr": 0.015801003729145894 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5228758169934641, "acc_stderr": 0.028599936776089782, "acc_norm": 0.5228758169934641, "acc_norm_stderr": 0.028599936776089782 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6141479099678456, "acc_stderr": 0.027648149599751468, "acc_norm": 0.6141479099678456, "acc_norm_stderr": 0.027648149599751468 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6265432098765432, "acc_stderr": 0.02691500301138016, "acc_norm": 0.6265432098765432, "acc_norm_stderr": 0.02691500301138016 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3617021276595745, "acc_stderr": 0.028663820147199495, "acc_norm": 0.3617021276595745, "acc_norm_stderr": 0.028663820147199495 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3578878748370274, "acc_stderr": 0.012243563850490313, "acc_norm": 0.3578878748370274, "acc_norm_stderr": 0.012243563850490313 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4411764705882353, "acc_stderr": 0.03016191193076711, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.03016191193076711 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.49836601307189543, "acc_stderr": 0.020227726838150117, "acc_norm": 0.49836601307189543, "acc_norm_stderr": 0.020227726838150117 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, 
"acc_stderr": 0.04494290866252089, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.04494290866252089 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6081632653061224, "acc_stderr": 0.031251275910891656, "acc_norm": 0.6081632653061224, "acc_norm_stderr": 0.031251275910891656 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6766169154228856, "acc_stderr": 0.03307615947979033, "acc_norm": 0.6766169154228856, "acc_norm_stderr": 0.03307615947979033 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.03942772444036624, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036624 }, "harness|hendrycksTest-virology|5": { "acc": 0.42168674698795183, "acc_stderr": 0.03844453181770917, "acc_norm": 0.42168674698795183, "acc_norm_stderr": 0.03844453181770917 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7134502923976608, "acc_stderr": 0.03467826685703826, "acc_norm": 0.7134502923976608, "acc_norm_stderr": 0.03467826685703826 }, "harness|truthfulqa:mc|0": { "mc1": 0.3108935128518972, "mc1_stderr": 0.016203316673559693, "mc2": 0.4616473915095851, "mc2_stderr": 0.014431098139511664 }, "harness|winogrande|5": { "acc": 0.7371744277821626, "acc_stderr": 0.01237092252726201 }, "harness|gsm8k|5": { "acc": 0.047763457164518575, "acc_stderr": 0.00587438753622932 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
wdndev/webnovel-chinese
--- license: apache-2.0 task_categories: - text-generation language: - zh tags: - llm - pretrain size_categories: - 1B<n<10B --- ## Introduction Web novels collected from the internet, then cleaned and split into chapters for training large language models: about 9,000 books in total, roughly 9B tokens. ## Usage ### Format Stored in `jsonl` format, with three fields per record: - `title`: novel title - `chapter`: chapter title - `text`: chapter body text Example: ```json {"title": "斗破苍穹", "chapter": " 第一章 陨落的天才", "text": "“斗之力,三段!”\n望着测验魔石碑上面闪亮得甚至有些刺眼的五个大字,少年面无表情,唇角有着一抹自嘲,紧握的手掌,因为大力,而导致略微尖锐的指甲深深的刺进了掌心之中,带来一阵阵钻心的疼痛……\n“萧炎,斗之力,三段!级别:低级!”测验魔石碑之旁,一位中年男子,看了一眼碑上所显示出来的信息,语气漠然的将之公布了出来……\n"} ```
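A minimal sketch of iterating over records in this three-field `jsonl` layout (the file path and the commented usage are illustrative, not part of this dataset's tooling):

```python
import json

def read_novels(path):
    """Yield one dict per non-empty line of a jsonl file with title/chapter/text fields."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                yield json.loads(line)

# Illustrative usage: each record is a dict with the three documented fields.
# for rec in read_novels("novels.jsonl"):
#     print(rec["title"], rec["chapter"], len(rec["text"]))
```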
MohammadHarrisCallME/_NetProgrammingBasics
--- license: llama2 ---
joey234/mmlu-logical_fallacies-original-neg-prepend
--- dataset_info: features: - name: question dtype: string - name: choices sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: neg_prompt dtype: string splits: - name: test num_bytes: 17815 num_examples: 35 download_size: 14481 dataset_size: 17815 --- # Dataset Card for "mmlu-logical_fallacies-original-neg-prepend" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
peterpanpan/stackoverflow-kubernetes-questions
--- license: apache-2.0 --- Converted from `https://huggingface.co/datasets/mcipriano/stackoverflow-kubernetes-questions/blob/main/README.md`, changing the format from Parquet to CSV. The conversion code is below: ```python import pandas as pd # Read the original Parquet dump and write it back out as CSV. data = pd.read_parquet("~/Downloads/kubernetes_dump.parquet") data.to_csv("/tmp/out.csv", index=False) ```
AIVOICES123424/yuri
--- dataset_info: features: - name: audio dtype: audio - name: text dtype: string splits: - name: train num_bytes: 352874.0 num_examples: 1 download_size: 304174 dataset_size: 352874.0 configs: - config_name: default data_files: - split: train path: data/train-* ---
Shivansh2310/trinity-dolly-10k
--- dataset_info: features: - name: instruction dtype: string - name: context dtype: string - name: response dtype: string - name: category dtype: string - name: text dtype: string splits: - name: train num_bytes: 16392818 num_examples: 10000 download_size: 10078470 dataset_size: 16392818 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "trinity-dolly-10k" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
HES-XPLAIN/SportsImageClassificationOld
--- task_categories: - image-classification language: - en tags: - sports size_categories: - 100M<n<1B ---
distilled-one-sec-cv12-each-chunk-uniq/chunk_82
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 1365922856.0 num_examples: 266158 download_size: 1398238480 dataset_size: 1365922856.0 --- # Dataset Card for "chunk_82" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
liuyanchen1015/MULTI_VALUE_cola_who_which
--- dataset_info: features: - name: sentence dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: dev num_bytes: 2049 num_examples: 23 - name: test num_bytes: 1406 num_examples: 17 - name: train num_bytes: 22193 num_examples: 245 download_size: 17836 dataset_size: 25648 --- # Dataset Card for "MULTI_VALUE_cola_who_which" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/wikitext-103-raw-v1-sent-permute-3
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 2181716652 num_examples: 7205397 - name: validation num_bytes: 1159288 num_examples: 3760 - name: test num_bytes: 1305088 num_examples: 4358 download_size: 1264108325 dataset_size: 2184181028 --- # Dataset Card for "wikitext-103-raw-v1-sent-permute-3" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ted_multi
--- pretty_name: TEDMulti paperswithcode_id: null dataset_info: features: - name: translations dtype: translation_variable_languages: languages: - ar - az - be - bg - bn - bs - calv - cs - da - de - el - en - eo - es - et - eu - fa - fi - fr - fr-ca - gl - he - hi - hr - hu - hy - id - it - ja - ka - kk - ko - ku - lt - mk - mn - mr - ms - my - nb - nl - pl - pt - pt-br - ro - ru - sk - sl - sq - sr - sv - ta - th - tr - uk - ur - vi - zh - zh-cn - zh-tw num_languages: 60 - name: talk_name dtype: string config_name: plain_text splits: - name: test num_bytes: 23364983 num_examples: 7213 - name: train num_bytes: 748209995 num_examples: 258098 - name: validation num_bytes: 19435383 num_examples: 6049 download_size: 352222045 dataset_size: 791010361 --- # Dataset Card for "ted_multi" ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** [https://github.com/neulab/word-embeddings-for-nmt](https://github.com/neulab/word-embeddings-for-nmt) - **Repository:** [More Information 
Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) - **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) - **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) - **Size of downloaded dataset files:** 352.23 MB - **Size of the generated dataset:** 791.01 MB - **Total amount of disk used:** 1.14 GB ### Dataset Summary Massively multilingual (60 language) data set derived from TED Talk transcripts. Each record consists of parallel arrays of language and text. Missing and incomplete translations will be filtered out. ### Supported Tasks and Leaderboards [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Languages [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ## Dataset Structure ### Data Instances #### plain_text - **Size of downloaded dataset files:** 352.23 MB - **Size of the generated dataset:** 791.01 MB - **Total amount of disk used:** 1.14 GB An example of 'validation' looks as follows. ``` This example was too long and was cropped: { "talk_name": "shabana_basij_rasikh_dare_to_educate_afghan_girls", "translations": "{\"language\": [\"ar\", \"az\", \"bg\", \"bn\", \"cs\", \"da\", \"de\", \"el\", \"en\", \"es\", \"fa\", \"fr\", \"he\", \"hi\", \"hr\", \"hu\", \"hy\", \"id\", \"it\", ..." } ``` ### Data Fields The data fields are the same among all splits. #### plain_text - `translations`: a multilingual `string` variable, with possible languages including `ar`, `az`, `be`, `bg`, `bn`. - `talk_name`: a `string` feature. 
### Data Splits | name |train |validation|test| |----------|-----:|---------:|---:| |plain_text|258098| 6049|7213| ## Dataset Creation ### Curation Rationale [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Source Data #### Initial Data Collection and Normalization [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) #### Who are the source language producers? [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Annotations #### Annotation process [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) #### Who are the annotators? [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Personal and Sensitive Information [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Discussion of Biases [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Other Known Limitations [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ## Additional Information ### Dataset Curators [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Licensing Information [More Information 
Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Citation Information ``` @InProceedings{qi-EtAl:2018:N18-2, author = {Qi, Ye and Sachan, Devendra and Felix, Matthieu and Padmanabhan, Sarguna and Neubig, Graham}, title = {When and Why Are Pre-Trained Word Embeddings Useful for Neural Machine Translation?}, booktitle = {Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)}, month = {June}, year = {2018}, address = {New Orleans, Louisiana}, publisher = {Association for Computational Linguistics}, pages = {529--535}, abstract = {The performance of Neural Machine Translation (NMT) systems often suffers in low-resource scenarios where sufficiently large-scale parallel corpora cannot be obtained. Pre-trained word embeddings have proven to be invaluable for improving performance in natural language analysis tasks, which often suffer from paucity of data. However, their utility for NMT has not been extensively explored. In this work, we perform five sets of experiments that analyze when we can expect pre-trained word embeddings to help in NMT tasks. We show that such embeddings can be surprisingly effective in some cases -- providing gains of up to 20 BLEU points in the most favorable setting.}, url = {http://www.aclweb.org/anthology/N18-2084} } ``` ### Contributions Thanks to [@thomwolf](https://github.com/thomwolf), [@lewtun](https://github.com/lewtun), [@patrickvonplaten](https://github.com/patrickvonplaten) for adding this dataset.
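Each record's `translations` field holds parallel `language`/`translation` arrays. A minimal sketch of pulling one language pair out of a record, assuming the field layout shown in the example above (`get_pair` is an illustrative helper, not part of the loader):

```python
def get_pair(record, src, tgt):
    # Build a language-code -> text mapping from the parallel arrays.
    by_lang = dict(zip(record["translations"]["language"],
                       record["translations"]["translation"]))
    if src in by_lang and tgt in by_lang:
        return by_lang[src], by_lang[tgt]
    return None  # one side of the pair is missing for this talk

# toy record mimicking the documented layout
record = {
    "talk_name": "demo_talk",
    "translations": {"language": ["de", "en", "fr"],
                     "translation": ["Hallo", "Hello", "Bonjour"]},
}
print(get_pair(record, "en", "de"))  # ('Hello', 'Hallo')
```

Since missing and incomplete translations are filtered out upstream, the `None` branch mainly guards against talks that simply lack one of the requested languages.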
usvsnsp/deduped-num-frequencies
--- dataset_info: features: - name: TokenID dtype: int64 - name: Frequency dtype: int64 splits: - name: memorized num_bytes: 960000 num_examples: 60000 - name: non_memorized num_bytes: 960000 num_examples: 60000 - name: total num_bytes: 960000 num_examples: 60000 download_size: 1974196 dataset_size: 2880000 --- # Dataset Card for "deduped-num-frequencies" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Om007/kendal_bot
--- task_categories: - question-answering language: - en --- # Dataset Card for Kendal <!-- Provide a quick summary of the dataset. --> This is a dataset for Kendal Bot. ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. 
It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. 
--> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
xppast/voice
--- license: mit dataset_info: features: - name: audio dtype: audio: sampling_rate: 16000 - name: transcripts dtype: string splits: - name: train num_bytes: 4785859.5 num_examples: 33 - name: test num_bytes: 1138202.1666666667 num_examples: 5 - name: valid num_bytes: 895347.3333333334 num_examples: 4 download_size: 6710917 dataset_size: 6819409.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* - split: valid path: data/valid-* ---
DBQ/Prada.Product.prices.Germany
--- annotations_creators: - other language_creators: - other language: - en license: - unknown multilinguality: - monolingual source_datasets: - original task_categories: - text-classification - image-classification - feature-extraction - image-segmentation - image-to-image - image-to-text - object-detection - summarization - zero-shot-image-classification pretty_name: Germany - Prada - Product-level price list tags: - webscraping - ecommerce - Prada - fashion - fashion product - image - fashion image configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: website_name dtype: string - name: competence_date dtype: string - name: country_code dtype: string - name: currency_code dtype: string - name: brand dtype: string - name: category1_code dtype: string - name: category2_code dtype: string - name: category3_code dtype: string - name: product_code dtype: string - name: title dtype: string - name: itemurl dtype: string - name: imageurl dtype: string - name: full_price dtype: float64 - name: price dtype: float64 - name: full_price_eur dtype: float64 - name: price_eur dtype: float64 - name: flg_discount dtype: int64 splits: - name: train num_bytes: 1315247 num_examples: 2588 download_size: 373618 dataset_size: 1315247 --- # Prada web scraped data ## About the website The **Luxury Fashion Industry** in the **EMEA** region, particularly in **Germany**, has experienced significant transformation in recent years. The face of the industry is continuously changing, with fashion houses like **Prada** sitting at the forefront of this evolution. With an increasing number of consumers turning to online platforms for shopping, the digitalization process has been greatly accelerated for high-end fashion labels. The **ecommerce** sector has thus become instrumental in driving sales. 
The dataset observed includes **Ecommerce product-list page (PLP) data** on **Prada** in Germany, offering a fresh perspective on buying habits, consumer behaviour, and fashion trends within this evolving industry in the region. ## Link to **dataset** [Germany - Prada - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Prada%20Product-prices%20Germany/r/recTwfXDGd6805PNU)
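As a reading aid for the price columns: `flg_discount` plausibly marks rows where `price` sits below `full_price` (an assumption from the field names, since the card does not define it). A minimal sketch:

```python
def discount_flag(full_price, price):
    """1 when the current price sits below full price, else 0 (assumed semantics)."""
    return int(price < full_price)

def discount_pct(full_price, price):
    """Markdown depth as a percentage of full price."""
    return round(100.0 * (1.0 - price / full_price), 1)

row = {"full_price": 1200.0, "price": 960.0}
print(discount_flag(**row), discount_pct(**row))  # 1 20.0
```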
richwardle/reduced-imagenet
--- license: apache-2.0 size_categories: - 10K<n<100K task_categories: - image-feature-extraction dataset_info: features: - name: image dtype: image splits: - name: train num_bytes: 2156767982.0 num_examples: 26000 download_size: 2183967663 dataset_size: 2156767982.0 configs: - config_name: default data_files: - split: train path: data/train-* --- # Imagenet Mini Dataset This dataset is a subset of the Imagenet validation set containing 26,000 images. It has been curated to have an equal class distribution, with 26 randomly sampled images from each class. All images have been resized to 224×224 pixels and are in RGB format. ## Citation If you use this dataset in your research, please cite the original Imagenet dataset: Deng, J., Dong, W., Socher, R., Li, L.-J., Li, K., & Fei-Fei, L. (2009). Imagenet: A large-scale hierarchical image database. In 2009 IEEE conference on computer vision and pattern recognition (pp. 248–255).
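The equal-class subsampling described above (a fixed number of randomly sampled images per class) can be sketched in a few lines; this is an illustrative reconstruction, not the script actually used to build the dataset:

```python
import random
from collections import defaultdict

def equal_class_subsample(labels, per_class, seed=0):
    """Pick `per_class` sample indices from every class, uniformly at random."""
    rng = random.Random(seed)
    by_label = defaultdict(list)
    for idx, label in enumerate(labels):
        by_label[label].append(idx)
    chosen = []
    for label in sorted(by_label):
        chosen.extend(rng.sample(by_label[label], per_class))
    return chosen

# toy run: 5 classes x 10 samples each, keep 3 per class -> 15 indices
labels = [i % 5 for i in range(50)]
subset = equal_class_subsample(labels, per_class=3)
assert len(subset) == 15
```

For the real dataset the same idea would run over the 1000 ImageNet classes with `per_class=26`, followed by a resize to 224×224.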
linqus/tokenized-codeparrot-ds-small
--- dataset_info: features: - name: input_ids sequence: int32 splits: - name: train num_bytes: 708311652 num_examples: 1372697 - name: valid num_bytes: 7259088 num_examples: 14068 download_size: 313854357 dataset_size: 715570740 configs: - config_name: default data_files: - split: train path: data/train-* - split: valid path: data/valid-* ---
bethgelab/frequency_determines_performance
--- license: mit task_categories: - zero-shot-classification - feature-extraction language: - en size_categories: - n<1K --- **Frequency estimation results and tagged samples:** `counts_and_indices.zip` contains all the result JSONs (the estimated frequencies for image-only, text-only and image-text searches) and the sample indices tagged to each concept for the LAION400m/LAION-Aesthetics datasets. **Constructed dictionaries and other pretraining and downstream data artefacts:** Due to the large size of all our data artefacts, we release our dictionaries and other feature artefacts as three split parts of a 110 GB zip file (named `features_zip_part_aa`, `features_zip_part_ab` and `features_zip_part_ac`). Please download all three parts, then concatenate them to reconstruct the original zip file: ```bash cat features_zip_part_aa features_zip_part_ab features_zip_part_ac > features.zip ``` Once combined, verify that the file transferred correctly by comparing its md5sum hash: ```bash md5sum features.zip ``` The output hash should be: `11f6339df3206257efdfc4a54dd7ca60 features.zip` For more details, see our [GitHub repository](https://github.com/bethgelab/frequency_determines_performance)
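The same check can be done from Python by streaming the parts through MD5 without first writing the combined archive to disk; a minimal sketch (the self-check below uses tiny stand-in files, not the real 110 GB parts):

```python
import hashlib
import os
import tempfile

def md5_of_parts(part_paths, chunk_size=1 << 20):
    """MD5 of the byte-wise concatenation of the parts, streamed chunk by chunk."""
    digest = hashlib.md5()
    for path in part_paths:
        with open(path, "rb") as fh:
            while chunk := fh.read(chunk_size):
                digest.update(chunk)
    return digest.hexdigest()

# self-check on two tiny stand-in parts: streaming over the split files
# must equal hashing the concatenated bytes directly
with tempfile.TemporaryDirectory() as d:
    parts = []
    for name, payload in [("features_zip_part_aa", b"hello "),
                          ("features_zip_part_ab", b"world")]:
        p = os.path.join(d, name)
        with open(p, "wb") as fh:
            fh.write(payload)
        parts.append(p)
    assert md5_of_parts(parts) == hashlib.md5(b"hello world").hexdigest()
```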
thefrankhsu/hate_speech_twitter
--- task_categories: - text-classification language: - en tags: - health - tweet - hate speech - mental health - hate speech detection - hate speech classification - social media - mobile health size_categories: - 1K<n<10K --- ## Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> The dataset is designed to analyze and address hate speech within online platforms. It consists of two sets: a training set and a testing set. Both are labeled, with instances of hate speech categorized into nine distinct categories. ## Dataset Description <!-- Provide a longer summary of what this dataset is. --> The dataset comprises three key features: tweets, labels (with hate speech denoted as 1 and non-hate speech as 0), and categories (behavior, class, disability, ethnicity, gender, physical appearance, race, religion, sexual orientation). * Training set: contains a total of 5679 tweets (Hate Speech: 1516 / Non Hate Speech: 4163); hate-speech tweets are not evenly distributed across categories. * Testing set: contains a total of 1000 tweets (Hate Speech: 500 / Non Hate Speech: 500); hate-speech tweets are roughly evenly distributed across categories. ## Uses This dataset can be utilized for various purposes, including but not limited to: * Developing and training machine learning models for hate speech detection. * Analyzing the prevalence and patterns of hate speech across different categories. * Understanding the challenges associated with categorizing hate speech on social media platforms. Check out the example [project](https://github.com/Wei-Hsi/AI4health)! ## Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> The dataset utilized in this study is sourced from Kaggle and named the [Hate Speech and Offensive Language dataset](https://www.kaggle.com/datasets/mrmorj/hate-speech-and-offensive-language-dataset/). 
Hate speech instances are identified by selecting tweets within the "class" column. ## Annotations <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> Category labels were generated through an OpenAI API call employing the GPT-3.5 model. It's important to note the instability in category predictions when utilizing GPT-3.5 for label generation, as it tends to predict different categories each time. However, we have confirmed that these tweets were labeled correctly. If there are any misclassified labels, please feel free to reach out. Thank you in advance for your assistance. ## Dataset Card Contact Please feel free to contact me via wh476@cornell.edu!
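A small sketch of working with the three documented fields (tweet, binary label, category); the row layout is assumed from the description above, not taken from the released files:

```python
from collections import Counter

def category_distribution(rows):
    """Count hate-speech rows (label == 1) per category."""
    return Counter(r["category"] for r in rows if r["label"] == 1)

# toy rows following the documented schema
rows = [
    {"tweet": "...", "label": 1, "category": "religion"},
    {"tweet": "...", "label": 0, "category": "gender"},
    {"tweet": "...", "label": 1, "category": "religion"},
    {"tweet": "...", "label": 1, "category": "disability"},
]
print(category_distribution(rows))
# Counter({'religion': 2, 'disability': 1})
```

Run over the training split, this kind of tally makes the uneven category distribution mentioned above directly visible.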
mixamrepijey/gorilla-hf
--- license: apache-2.0 ---
Lostkyd/InstrucData
--- dataset_info: features: - name: Instruction dtype: string - name: Input dtype: string - name: Output dtype: string splits: - name: train num_bytes: 173299 num_examples: 90 download_size: 40079 dataset_size: 173299 configs: - config_name: default data_files: - split: train path: data/train-* ---
CyberHarem/mujina_azurlane
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of mujina/ムジナ/貉SSSS (Azur Lane) This is the dataset of mujina/ムジナ/貉SSSS (Azur Lane), containing 283 images and their tags. The core tags of this character are `short_hair, breasts, blue_eyes, large_breasts, bangs`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 283 | 368.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mujina_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 283 | 198.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mujina_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 693 | 420.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mujina_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 283 | 325.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mujina_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 693 | 606.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mujina_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. 
| ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/mujina_azurlane', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, corset, military_jacket, purple_shorts, solo, white_gloves, white_jacket, looking_at_viewer, short_necktie, thighs, purple_necktie, underbust, white_background, white_shirt, short_shorts, collared_shirt, long_sleeves, open_jacket, simple_background, white_footwear, blush | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | 
![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, cleavage, purple_bikini, solo, thighs, bare_shoulders, blush, looking_at_viewer, navel, collarbone, beach, blue_sky, day, ocean, outdoors, pink_hair, sitting, wet, cloud, crossed_legs, hair_between_eyes | | 2 | 16 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, cleavage, navel, solo, looking_at_viewer, purple_bikini, white_background, brown_hair, simple_background, thighs, bare_shoulders, sitting, wet | | 3 | 36 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, hetero, 1boy, blush, penis, solo_focus, nipples, open_mouth, sex, mosaic_censoring, completely_nude, vaginal, collarbone, pussy, sweat, navel, looking_at_viewer, spread_legs, thighs, cum, brown_hair, lying, hair_between_eyes | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | corset | military_jacket | purple_shorts | solo | white_gloves | white_jacket | looking_at_viewer | short_necktie | thighs | purple_necktie | underbust | white_background | white_shirt | short_shorts | collared_shirt | long_sleeves | open_jacket | simple_background | white_footwear | blush | cleavage | purple_bikini | bare_shoulders | navel | collarbone | beach | blue_sky | day | ocean | outdoors | pink_hair | sitting | wet | cloud | crossed_legs | hair_between_eyes | brown_hair | hetero | 1boy | penis | solo_focus | nipples | open_mouth | sex | mosaic_censoring | completely_nude | vaginal | pussy | sweat | spread_legs | cum | lying | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:------------------|:----------------|:-------|:---------------|:---------------|:--------------------|:----------------|:---------|:-----------------|:------------|:-------------------|:--------------|:---------------|:-----------------|:---------------|:--------------|:--------------------|:-----------------|:--------|:-----------|:----------------|:-----------------|:--------|:-------------|:--------|:-----------|:------|:--------|:-----------|:------------|:----------|:------|:--------|:---------------|:--------------------|:-------------|:---------|:-------|:--------|:-------------|:----------|:-------------|:------|:-------------------|:------------------|:----------|:--------|:--------|:--------------|:------|:--------| | 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | | | X | | | X | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | 2 | 16 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | | X | | | X | | X | | | X | | | | | | X | | | X | X | X | X | | | | | | | | X | X | | | | X | | | | | | | | | | | | | | | | | 3 | 36 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | 
![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | | | | | | X | | X | | | | | | | | | | | X | | | | X | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
joachimsallstrom/mjportraits_and_blurred
--- license: creativeml-openrail-m ---
manishh16/crack
--- dataset_info: features: - name: pixel_values dtype: image - name: label dtype: image splits: - name: train num_bytes: 249182453.0 num_examples: 31 download_size: 22493785 dataset_size: 249182453.0 configs: - config_name: default data_files: - split: train path: data/train-* ---
mcemilg/turkish-plu-goal-inference
--- task_categories: - text-classification language: - tr size_categories: - 100K<n<1M --- Homepage: https://github.com/GGLAB-KU/turkish-plu
theBrokenCat/EdificioPereda
--- dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 452640209.0 num_examples: 64 download_size: 444672075 dataset_size: 452640209.0 configs: - config_name: default data_files: - split: train path: data/train-* ---
liuyanchen1015/MULTI_VALUE_mnli_regularized_reflexives_aave
--- dataset_info: features: - name: premise dtype: string - name: hypothesis dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: score dtype: int64 splits: - name: dev_matched num_bytes: 19251 num_examples: 94 - name: dev_mismatched num_bytes: 23121 num_examples: 87 - name: test_matched num_bytes: 21604 num_examples: 90 - name: test_mismatched num_bytes: 20670 num_examples: 82 - name: train num_bytes: 936051 num_examples: 3883 download_size: 578604 dataset_size: 1020697 --- # Dataset Card for "MULTI_VALUE_mnli_regularized_reflexives_aave" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
bengaliAI/cvbn
--- license: cc ---
zhan1993/ARB_transfer_matrix_v2
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: expert_name dtype: string - name: task_eval_on dtype: string - name: score dtype: float64 splits: - name: train num_bytes: 15849 num_examples: 356 download_size: 10642 dataset_size: 15849 --- # Dataset Card for "ARB_transfer_matrix_v2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Serverless/dev_mode-wtq
--- annotations_creators: - crowdsourced language_creators: - found language: - en license: - cc-by-4.0 multilinguality: - monolingual paperswithcode_id: null pretty_name: WikiTableQuestions-wtq size_categories: - 10K<n<100K source_datasets: - wikitablequestions task_categories: - question-answering task_ids: [] tags: - table-question-answering --- # Dataset Card for dev_mode-wtq ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-instances) - [Data Splits](#data-instances) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) ## Dataset Description - **Homepage:** [WikiTableQuestions homepage](https://nlp.stanford.edu/software/sempre/wikitable) - **Repository:** [WikiTableQuestions repository](https://github.com/ppasupat/WikiTableQuestions) - **Paper:** [Compositional Semantic Parsing on Semi-Structured Tables](https://arxiv.org/abs/1508.00305) - **Leaderboard:** [WikiTableQuestions leaderboard on PaperWithCode](https://paperswithcode.com/dataset/wikitablequestions) - **Point of Contact:** [Needs More Information] ### Dataset Summary The dev_mode-wtq dataset is a small-scale dataset for the task of question answering on semi-structured tables. 
This dataset includes the `aggregation_label` and `answer_coordinates` fields to make it easy to fine-tune any [TAPAS](https://huggingface.co/docs/transformers/model_doc/tapas#usage-finetuning)-based model. ### Supported Tasks and Leaderboards question-answering, table-question-answering ### Languages en ## Dataset Structure ### Data Instances #### default - **Size of downloaded dataset files:** 27.91 MB - **Size of the generated dataset:** 45.68 MB - **Total amount of disk used:** 73.60 MB An example of 'validation' looks as follows: ``` { "id": "nt-0", "question": "What is the duration for the last invocation?", "answers": [ "340 ms" ], "table": { "header": [ "recent", "type", "spans", "logs", "errors", "warnings", "duration", "resource" ], "rows": [ [ "1", "span", "1", "1", "1", "2", "340 ms", "aws-lambda-typescript-express-dev-express" ] ] } } ``` ### Data Fields The data fields are the same among all splits. #### default - `id`: a `string` feature. - `question`: a `string` feature. - `answers`: a `list` of `string` features. - `answers_coordinates`: a `list` of `int,int` tuples. - `aggregation_label`: a `string` feature. - `table`: a dictionary feature containing: - `header`: a `list` of `string` features. - `rows`: a `list` of `list` of `string` features: - `name`: a `string` feature. ### Data Splits TBA ## Dataset Creation ### Curation Rationale [Needs More Information] ### Source Data #### Initial Data Collection and Normalization [Needs More Information] #### Who are the source language producers? [Needs More Information] ### Annotations #### Annotation process [Needs More Information] #### Who are the annotators?
[Needs More Information] ### Personal and Sensitive Information [Needs More Information] ## Considerations for Using the Data ### Social Impact of Dataset [Needs More Information] ### Discussion of Biases [Needs More Information] ### Other Known Limitations [Needs More Information] ## Additional Information ### Dataset Curators Panupong Pasupat and Percy Liang ### Licensing Information Creative Commons Attribution Share Alike 4.0 International ### Citation Information ``` @inproceedings{pasupat-liang-2015-compositional, title = "Compositional Semantic Parsing on Semi-Structured Tables", author = "Pasupat, Panupong and Liang, Percy", booktitle = "Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)", month = jul, year = "2015", address = "Beijing, China", publisher = "Association for Computational Linguistics", url = "https://aclanthology.org/P15-1142", doi = "10.3115/v1/P15-1142", pages = "1470--1480", } ``` ### Contributions Thanks to [@SivilTaram](https://github.com/SivilTaram) for adding this dataset.
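For reference, a record in the layout documented above can be turned into column-addressable rows like this (a minimal sketch using only the standard library; the values are copied from the example instance in this card):

```python
# Example record in the format documented in this card (values copied
# from the example 'validation' instance above).
record = {
    "id": "nt-0",
    "question": "What is the duration for the last invocation?",
    "answers": ["340 ms"],
    "table": {
        "header": ["recent", "type", "spans", "logs", "errors",
                   "warnings", "duration", "resource"],
        "rows": [["1", "span", "1", "1", "1", "2", "340 ms",
                  "aws-lambda-typescript-express-dev-express"]],
    },
}

# Pair each row with the header to get one dict per table row -- the
# shape most table-QA preprocessing (e.g. building a DataFrame for a
# TAPAS-style tokenizer) starts from.
header = record["table"]["header"]
rows = [dict(zip(header, row)) for row in record["table"]["rows"]]

assert rows[0]["duration"] == record["answers"][0]
print(rows[0]["duration"])  # -> 340 ms
```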
dariatsisar/sample_dataset
--- dataset_info: features: - name: text dtype: string - name: label dtype: class_label: names: '0': science/technology '1': travel '2': politics '3': health splits: - name: train num_bytes: 104902 num_examples: 394 - name: test num_bytes: 24642 num_examples: 99 download_size: 76286 dataset_size: 129544 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
amktk/ktkDataSet
--- dataset_info: features: - name: audio dtype: audio - name: transctiption dtype: string splits: - name: train num_bytes: 71647032.0 num_examples: 10 download_size: 60508649 dataset_size: 71647032.0 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "ktkDataSet" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
PrabhaB/guanaco-llama2-1k
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 1654448 num_examples: 1000 download_size: 966692 dataset_size: 1654448 configs: - config_name: default data_files: - split: train path: data/train-* ---
Cohere/wikipedia-22-12-zh-embeddings
--- language: - zh multilinguality: - multilingual size_categories: [] source_datasets: [] tags: [] task_categories: - text-retrieval license: - apache-2.0 task_ids: - document-retrieval --- # Wikipedia (zh) embedded with cohere.ai `multilingual-22-12` encoder We encoded [Wikipedia (zh)](https://zh.wikipedia.org) using the [cohere.ai](https://txt.cohere.ai/multilingual/) `multilingual-22-12` embedding model. For an overview of how this dataset was created and pre-processed, have a look at [Cohere/wikipedia-22-12](https://huggingface.co/datasets/Cohere/wikipedia-22-12). ## Embeddings We computed the embeddings for `title + " " + text` using our `multilingual-22-12` embedding model, a state-of-the-art model that works for semantic search in 100 languages. If you want to learn more about this model, have a look at [cohere.ai multilingual embedding model](https://txt.cohere.ai/multilingual/). ## Further languages We provide embeddings of Wikipedia in many different languages: [ar](https://huggingface.co/datasets/Cohere/wikipedia-22-12-ar-embeddings), [de](https://huggingface.co/datasets/Cohere/wikipedia-22-12-de-embeddings), [en](https://huggingface.co/datasets/Cohere/wikipedia-22-12-en-embeddings), [es](https://huggingface.co/datasets/Cohere/wikipedia-22-12-es-embeddings), [fr](https://huggingface.co/datasets/Cohere/wikipedia-22-12-fr-embeddings), [hi](https://huggingface.co/datasets/Cohere/wikipedia-22-12-hi-embeddings), [it](https://huggingface.co/datasets/Cohere/wikipedia-22-12-it-embeddings), [ja](https://huggingface.co/datasets/Cohere/wikipedia-22-12-ja-embeddings), [ko](https://huggingface.co/datasets/Cohere/wikipedia-22-12-ko-embeddings), [simple english](https://huggingface.co/datasets/Cohere/wikipedia-22-12-simple-embeddings), [zh](https://huggingface.co/datasets/Cohere/wikipedia-22-12-zh-embeddings). You can find the Wikipedia datasets without embeddings at [Cohere/wikipedia-22-12](https://huggingface.co/datasets/Cohere/wikipedia-22-12). 
## Loading the dataset You can load the dataset like this: ```python from datasets import load_dataset docs = load_dataset("Cohere/wikipedia-22-12-zh-embeddings", split="train") ``` Or you can stream it without downloading it first: ```python from datasets import load_dataset docs = load_dataset("Cohere/wikipedia-22-12-zh-embeddings", split="train", streaming=True) for doc in docs: docid = doc['id'] title = doc['title'] text = doc['text'] emb = doc['emb'] ``` ## Search A full search example: ```python # Run: pip install cohere datasets from datasets import load_dataset import torch import cohere co = cohere.Client("<<COHERE_API_KEY>>") # Add your cohere API key from www.cohere.com # Load at most 1000 documents + embeddings max_docs = 1000 docs_stream = load_dataset("Cohere/wikipedia-22-12-zh-embeddings", split="train", streaming=True) docs = [] doc_embeddings = [] for doc in docs_stream: docs.append(doc) doc_embeddings.append(doc['emb']) if len(docs) >= max_docs: break doc_embeddings = torch.tensor(doc_embeddings) query = 'Who founded Youtube' response = co.embed(texts=[query], model='multilingual-22-12') query_embedding = response.embeddings query_embedding = torch.tensor(query_embedding) # Compute dot score between query embedding and document embeddings dot_scores = torch.mm(query_embedding, doc_embeddings.transpose(0, 1)) top_k = torch.topk(dot_scores, k=3) # Print results print("Query:", query) for doc_id in top_k.indices[0].tolist(): print(docs[doc_id]['title']) print(docs[doc_id]['text'], "\n") ``` ## Performance You can find performance on the MIRACL dataset (a semantic search evaluation dataset) here: [miracl-en-queries-22-12#performance](https://huggingface.co/datasets/Cohere/miracl-en-queries-22-12#performance)
xedwin23x/StanfordCars
--- license: unknown ---
mboth/luftBereitstellen-100-undersampled
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* - split: valid path: data/valid-* dataset_info: features: - name: Datatype dtype: string - name: Beschreibung dtype: string - name: Name dtype: string - name: Unit dtype: string - name: text dtype: string - name: Grundfunktion dtype: string - name: ZweiteGrundfunktion dtype: string - name: label dtype: class_label: names: '0': AbluftAllgemein '1': Abluftfilter '2': Abluftklappe '3': Abluftventilator '4': Außenluftfilter '5': Außenluftklappe '6': Befeuchter '7': Erhitzer '8': Filter '9': Fortluftklappe '10': GerätAllgemein '11': Kaeltemengenzaehler '12': KlappenAllgemein '13': Kühler '14': Regler '15': Umluft '16': Ventilator '17': Wärmemengenzähler '18': Wärmerückgewinnung '19': ZuluftAllgemein '20': Zuluftfilter '21': Zuluftklappe '22': Zuluftventilator splits: - name: train num_bytes: 378107.292848404 num_examples: 1778 - name: test num_bytes: 238179 num_examples: 1124 - name: valid num_bytes: 238179 num_examples: 1124 download_size: 280245 dataset_size: 854465.292848404 --- # Dataset Card for "luftBereitstellen-100-undersampled" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ben-epstein/amazon_polarity_10_pct
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: label dtype: class_label: names: '0': negative '1': positive - name: title dtype: string - name: content dtype: string splits: - name: train num_bytes: 163359702 num_examples: 360000 - name: test num_bytes: 18182813 num_examples: 40000 download_size: 120691417 dataset_size: 181542515 --- # Amazon Polarity 10pct This is a direct subset of the original [Amazon Polarity](https://huggingface.co/datasets/amazon_polarity) dataset, downsampled to 10% with a random shuffle. ### Dataset Summary A smaller subset for quicker testing on Amazon Polarity. See https://huggingface.co/datasets/amazon_polarity for details and attributions. ### Source Data ```python from datasets import ClassLabel, Dataset, DatasetDict, load_dataset ds_full = load_dataset("amazon_polarity", streaming=True) ds_train_10_pct = Dataset.from_list(list(ds_full["train"].shuffle(seed=42).take(360_000))) ds_test_10_pct = Dataset.from_list(list(ds_full["test"].shuffle(seed=42).take(40_000))) ds_10_pct = DatasetDict({"train": ds_train_10_pct, "test": ds_test_10_pct}) # Need to recreate the class labels class_label = ClassLabel(num_classes=2, names=["negative", "positive"]) ds_10_pct = ds_10_pct.map(lambda row: {"title": row["title"], "content": row["content"], "label": "negative" if not row["label"] else "positive"}) ds_10_pct = ds_10_pct.cast_column("label", class_label) ```
aisc-team-a1/guidelines-qa-finetuning
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 2053169 num_examples: 75 download_size: 511694 dataset_size: 2053169 configs: - config_name: default data_files: - split: train path: data/train-* ---
gagan3012/areta_v3
--- dataset_info: features: - name: text sequence: string - name: detect_tags sequence: string - name: correct_tags sequence: string - name: len_text dtype: int64 - name: len_detect_tags dtype: int64 - name: len_correct_tags dtype: int64 splits: - name: train num_bytes: 96930716 num_examples: 100000 - name: validation num_bytes: 1986694 num_examples: 1017 download_size: 19852500 dataset_size: 98917410 --- # Dataset Card for "areta_v3" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Chunt0/paul_price
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 10352711.0 num_examples: 41 download_size: 10344553 dataset_size: 10352711.0 --- # Dataset Card for "paul_price" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
vilsonrodrigues/lfw
--- license: apache-2.0 --- Samples from the LFW dataset. Samples with more than one face image per user were selected. They were then partitioned into two directories, ingestion and recovery, in order to test a facial recognition system.
arthurMM801/cnh-rg-cpf
--- dataset_info: features: - name: image dtype: image - name: label dtype: class_label: names: '0': cnh_completo '1': cpf_completo '2': rg_completo - name: ground_truth dtype: string splits: - name: train num_bytes: 16204820.0 num_examples: 48 - name: test num_bytes: 4901507.0 num_examples: 12 download_size: 21109239 dataset_size: 21106327.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
amcoff/recept
--- annotations_creators: - no-annotation language: - sv language_creators: - found license: - mit multilinguality: - monolingual pretty_name: Recept size_categories: - 10K<n<100K source_datasets: - original tags: [] task_categories: - text-classification task_ids: [] --- # Dataset Card for Recept ### Dataset Summary [More Information Needed] ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
Jaskirat-04/fine-tunning
--- license: mit ---
heliosprime/twitter_dataset_1713167560
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 3203 num_examples: 9 download_size: 8354 dataset_size: 3203 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "twitter_dataset_1713167560" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_jxhong__CAlign-alpaca-7b
--- pretty_name: Evaluation run of jxhong/CAlign-alpaca-7b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [jxhong/CAlign-alpaca-7b](https://huggingface.co/jxhong/CAlign-alpaca-7b) on the\ \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jxhong__CAlign-alpaca-7b\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-09-23T14:18:50.060462](https://huggingface.co/datasets/open-llm-leaderboard/details_jxhong__CAlign-alpaca-7b/blob/main/results_2023-09-23T14-18-50.060462.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.1967281879194631,\n\ \ \"em_stderr\": 0.0040710291374288195,\n \"f1\": 0.2515457214765097,\n\ \ \"f1_stderr\": 0.004085507734234057,\n \"acc\": 0.36712327209690443,\n\ \ \"acc_stderr\": 0.007903286807442752\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.1967281879194631,\n \"em_stderr\": 0.0040710291374288195,\n\ \ \"f1\": 0.2515457214765097,\n \"f1_stderr\": 0.004085507734234057\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.013646702047005308,\n \ \ \"acc_stderr\": 0.003195747075480819\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7205998421468035,\n \"acc_stderr\": 0.012610826539404686\n\ \ }\n}\n```" repo_url: https://huggingface.co/jxhong/CAlign-alpaca-7b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|arc:challenge|25_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-08-09T20:26:06.755216.parquet' - config_name: harness_drop_3 data_files: - split: 2023_09_23T14_18_50.060462 path: - '**/details_harness|drop|3_2023-09-23T14-18-50.060462.parquet' - split: latest path: - '**/details_harness|drop|3_2023-09-23T14-18-50.060462.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_09_23T14_18_50.060462 path: - '**/details_harness|gsm8k|5_2023-09-23T14-18-50.060462.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-09-23T14-18-50.060462.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hellaswag|10_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - 
'**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:26:06.755216.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:26:06.755216.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:26:06.755216.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:26:06.755216.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:26:06.755216.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-09T20:26:06.755216.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:26:06.755216.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-management|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-virology|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:26:06.755216.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_08_09T20_26_06.755216 path: - '**/details_harness|truthfulqa:mc|0_2023-08-09T20:26:06.755216.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-08-09T20:26:06.755216.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_09_23T14_18_50.060462 path: - '**/details_harness|winogrande|5_2023-09-23T14-18-50.060462.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-09-23T14-18-50.060462.parquet' - config_name: results data_files: - split: 2023_08_09T20_26_06.755216 path: - results_2023-08-09T20:26:06.755216.parquet - split: 2023_09_23T14_18_50.060462 path: - results_2023-09-23T14-18-50.060462.parquet - split: latest path: - results_2023-09-23T14-18-50.060462.parquet --- # Dataset Card for Evaluation run of jxhong/CAlign-alpaca-7b ## Dataset Description - 
**Homepage:**
- **Repository:** https://huggingface.co/jxhong/CAlign-alpaca-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [jxhong/CAlign-alpaca-7b](https://huggingface.co/jxhong/CAlign-alpaca-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration, "results", stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jxhong__CAlign-alpaca-7b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T14:18:50.060462](https://huggingface.co/datasets/open-llm-leaderboard/details_jxhong__CAlign-alpaca-7b/blob/main/results_2023-09-23T14-18-50.060462.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):

```python
{
    "all": {
        "em": 0.1967281879194631,
        "em_stderr": 0.0040710291374288195,
        "f1": 0.2515457214765097,
        "f1_stderr": 0.004085507734234057,
        "acc": 0.36712327209690443,
        "acc_stderr": 0.007903286807442752
    },
    "harness|drop|3": {
        "em": 0.1967281879194631,
        "em_stderr": 0.0040710291374288195,
        "f1": 0.2515457214765097,
        "f1_stderr": 0.004085507734234057
    },
    "harness|gsm8k|5": {
        "acc": 0.013646702047005308,
        "acc_stderr": 0.003195747075480819
    },
    "harness|winogrande|5": {
        "acc": 0.7205998421468035,
        "acc_stderr": 0.012610826539404686
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
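The "all" block in the latest-results JSON is consistent with a plain unweighted mean over the accuracy-based tasks (drop reports em/f1 rather than acc, so it does not enter that mean). A quick standard-library check, using only the numbers printed in the JSON above:

```python
import json

# Per-task accuracies copied from the latest-results JSON above.
latest = json.loads("""
{
  "harness|gsm8k|5":      {"acc": 0.013646702047005308},
  "harness|winogrande|5": {"acc": 0.7205998421468035}
}
""")

# The "all" entry reports acc = 0.36712327209690443, which matches the
# unweighted mean of the two accuracy-based tasks.
mean_acc = sum(task["acc"] for task in latest.values()) / len(latest)
print(round(mean_acc, 6))  # 0.367123
```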
AdapterOcean/code_instructions_standardized_cluster_12_alpaca
---
dataset_info:
  features:
  - name: input
    dtype: string
  - name: output
    dtype: string
  splits:
  - name: train
    num_bytes: 13843792
    num_examples: 7746
  download_size: 7130413
  dataset_size: 13843792
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---

# Dataset Card for "code_instructions_standardized_cluster_12_alpaca"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
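The card only declares `input`/`output` string columns. Assuming the rows follow the usual Alpaca-style instruction format (an assumption suggested by the dataset name, not documented in the card), a pair can be rendered into a single training prompt like this — both the template and the example row are illustrative:

```python
# Hypothetical (input, output) row; the card above only tells us that
# both columns are strings.
row = {
    "input": "Write a function that reverses a string.",
    "output": "def reverse(s):\n    return s[::-1]",
}

# A common Alpaca-style prompt template (an assumption, not taken from
# the dataset itself).
PROMPT_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that completes the request.\n\n"
    "### Instruction:\n{input}\n\n### Response:\n{output}"
)

prompt = PROMPT_TEMPLATE.format(**row)
print(prompt.splitlines()[0])
```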
autoevaluate/autoeval-eval-billsum-default-258166-2318473352
---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- billsum
eval_info:
  task: summarization
  model: Artifact-AI/t5_base_courtlistener_billsum
  metrics: []
  dataset_name: billsum
  dataset_config: default
  dataset_split: test
  col_mapping:
    text: text
    target: summary
---

# Dataset Card for AutoTrain Evaluator

This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:

* Task: Summarization
* Model: Artifact-AI/t5_base_courtlistener_billsum
* Dataset: billsum
* Config: default
* Split: test

To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).

## Contributions

Thanks to [@Artifact-AI](https://huggingface.co/Artifact-AI) for evaluating this model.
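The `col_mapping` entry in the metadata above tells the evaluator which dataset columns feed the task's expected fields (`text` and `target`). As a sketch of what applying such a mapping looks like — the row below is a hypothetical billsum-style record, not real data:

```python
# From the metadata above: the summarization task expects "text" and
# "target", which map to the dataset's "text" and "summary" columns.
col_mapping = {"text": "text", "target": "summary"}

# Hypothetical row in the billsum schema.
row = {"text": "SECTION 1. SHORT TITLE. This Act may be cited as ...",
       "summary": "Establishes a short title for the Act."}

# Rename dataset columns into the task's expected fields.
task_row = {task_col: row[src_col] for task_col, src_col in col_mapping.items()}
print(sorted(task_row))  # ['target', 'text']
```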
nlpUc3mStudents/mental-risk-b
---
dataset_info:
  features:
  - name: subject_id
    dtype: string
  - name: id_message
    dtype: int64
  - name: date
    dtype: string
  - name: message
    dtype: string
  - name: label
    dtype: float64
  splits:
  - name: train
    num_bytes: 800039
    num_examples: 6248
  - name: test
    num_bytes: 76071
    num_examples: 624
  download_size: 475767
  dataset_size: 876110
---

# Dataset Card for "mental-risk-b"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
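Given the schema above (one row per message, keyed by `subject_id`, with a per-row `label`), a typical first step is regrouping messages by subject. A minimal sketch over hypothetical rows — the values below are invented to match the declared column types, not drawn from the dataset:

```python
from collections import defaultdict

# Hypothetical rows following the card's schema; real values differ.
rows = [
    {"subject_id": "s01", "id_message": 1, "date": "2023-01-02",
     "message": "first message", "label": 1.0},
    {"subject_id": "s01", "id_message": 2, "date": "2023-01-03",
     "message": "second message", "label": 1.0},
    {"subject_id": "s02", "id_message": 1, "date": "2023-01-02",
     "message": "another subject", "label": 0.0},
]

# One entry per subject: the ordered message history plus its label.
subjects = defaultdict(lambda: {"messages": [], "label": None})
for r in rows:
    subjects[r["subject_id"]]["messages"].append(r["message"])
    subjects[r["subject_id"]]["label"] = r["label"]

print({s: len(v["messages"]) for s, v in subjects.items()})  # {'s01': 2, 's02': 1}
```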
DepositorOP/masterstack
---
dataset_info:
  features:
  - name: text
    dtype: string
  - name: labels
    dtype: float64
  splits:
  - name: test
    num_bytes: 151727.48160821214
    num_examples: 702
  - name: train
    num_bytes: 1364250.5183917878
    num_examples: 6312
  download_size: 1016008
  dataset_size: 1515978.0
---

# Dataset Card for "masterstack"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
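The split sizes stated in the card imply roughly a 90/10 train/test partition; checking from the example counts above:

```python
# Example counts taken from the card's split metadata.
num_train = 6312
num_test = 702

# Fraction of examples held out for testing (close to a 10% split).
test_fraction = num_test / (num_train + num_test)
print(round(test_fraction, 3))  # 0.1
```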