---
pretty_name: Evaluation run of adamo1139/Yi-34B-200K-AEZAKMI-RAW-2301-LoRA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [adamo1139/Yi-34B-200K-AEZAKMI-RAW-2301-LoRA](https://huggingface.co/adamo1139/Yi-34B-200K-AEZAKMI-RAW-2301-LoRA)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_adamo1139__Yi-34B-200K-AEZAKMI-RAW-2301-LoRA\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-27T16:09:42.767487](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__Yi-34B-200K-AEZAKMI-RAW-2301-LoRA/blob/main/results_2024-01-27T16-09-42.767487.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7413605787370283,\n\
\ \"acc_stderr\": 0.02895069135836259,\n \"acc_norm\": 0.7476488274301629,\n\
\ \"acc_norm_stderr\": 0.029481162291123596,\n \"mc1\": 0.40758873929008566,\n\
\ \"mc1_stderr\": 0.017201949234553107,\n \"mc2\": 0.5708422092704679,\n\
\ \"mc2_stderr\": 0.015184723749426742\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759091,\n\
\ \"acc_norm\": 0.659556313993174,\n \"acc_norm_stderr\": 0.01384746051889298\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6420035849432384,\n\
\ \"acc_stderr\": 0.0047843129724954,\n \"acc_norm\": 0.8388767177853017,\n\
\ \"acc_norm_stderr\": 0.003668932629672556\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.868421052631579,\n \"acc_stderr\": 0.027508689533549915,\n\
\ \"acc_norm\": 0.868421052631579,\n \"acc_norm_stderr\": 0.027508689533549915\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7962264150943397,\n \"acc_stderr\": 0.024790784501775402,\n\
\ \"acc_norm\": 0.7962264150943397,\n \"acc_norm_stderr\": 0.024790784501775402\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.02628055093284806,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.02628055093284806\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7167630057803468,\n\
\ \"acc_stderr\": 0.034355680560478746,\n \"acc_norm\": 0.7167630057803468,\n\
\ \"acc_norm_stderr\": 0.034355680560478746\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.04971358884367406,\n\
\ \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.04971358884367406\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7531914893617021,\n \"acc_stderr\": 0.02818544130123409,\n\
\ \"acc_norm\": 0.7531914893617021,\n \"acc_norm_stderr\": 0.02818544130123409\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7310344827586207,\n \"acc_stderr\": 0.036951833116502325,\n\
\ \"acc_norm\": 0.7310344827586207,\n \"acc_norm_stderr\": 0.036951833116502325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.0242785680243077,\n \"acc_norm\"\
: 0.6666666666666666,\n \"acc_norm_stderr\": 0.0242785680243077\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8870967741935484,\n\
\ \"acc_stderr\": 0.01800360332586361,\n \"acc_norm\": 0.8870967741935484,\n\
\ \"acc_norm_stderr\": 0.01800360332586361\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6502463054187192,\n \"acc_stderr\": 0.03355400904969566,\n\
\ \"acc_norm\": 0.6502463054187192,\n \"acc_norm_stderr\": 0.03355400904969566\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n\
\ \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9242424242424242,\n \"acc_stderr\": 0.018852670234993093,\n \"\
acc_norm\": 0.9242424242424242,\n \"acc_norm_stderr\": 0.018852670234993093\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527029,\n\
\ \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527029\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8051282051282052,\n \"acc_stderr\": 0.020083167595181393,\n\
\ \"acc_norm\": 0.8051282051282052,\n \"acc_norm_stderr\": 0.020083167595181393\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476668,\n \
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476668\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8403361344537815,\n \"acc_stderr\": 0.023793353997528802,\n\
\ \"acc_norm\": 0.8403361344537815,\n \"acc_norm_stderr\": 0.023793353997528802\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5033112582781457,\n \"acc_stderr\": 0.04082393379449654,\n \"\
acc_norm\": 0.5033112582781457,\n \"acc_norm_stderr\": 0.04082393379449654\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9119266055045872,\n \"acc_stderr\": 0.01215074371948165,\n \"\
acc_norm\": 0.9119266055045872,\n \"acc_norm_stderr\": 0.01215074371948165\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6527777777777778,\n \"acc_stderr\": 0.032468872436376486,\n \"\
acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.032468872436376486\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8970588235294118,\n \"acc_stderr\": 0.02132833757080437,\n \"\
acc_norm\": 0.8970588235294118,\n \"acc_norm_stderr\": 0.02132833757080437\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065498,\n \
\ \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065498\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7713004484304933,\n\
\ \"acc_stderr\": 0.028188240046929203,\n \"acc_norm\": 0.7713004484304933,\n\
\ \"acc_norm_stderr\": 0.028188240046929203\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.030884661089515375,\n\
\ \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.030884661089515375\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622804,\n \"\
acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622804\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8796296296296297,\n\
\ \"acc_stderr\": 0.03145703854306251,\n \"acc_norm\": 0.8796296296296297,\n\
\ \"acc_norm_stderr\": 0.03145703854306251\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8588957055214724,\n \"acc_stderr\": 0.027351605518389752,\n\
\ \"acc_norm\": 0.8588957055214724,\n \"acc_norm_stderr\": 0.027351605518389752\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\
\ \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n\
\ \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.0339329572976101,\n\
\ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.0339329572976101\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n\
\ \"acc_stderr\": 0.01553751426325388,\n \"acc_norm\": 0.9401709401709402,\n\
\ \"acc_norm_stderr\": 0.01553751426325388\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8876117496807152,\n\
\ \"acc_stderr\": 0.011294541351216554,\n \"acc_norm\": 0.8876117496807152,\n\
\ \"acc_norm_stderr\": 0.011294541351216554\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8208092485549133,\n \"acc_stderr\": 0.020647590029679332,\n\
\ \"acc_norm\": 0.8208092485549133,\n \"acc_norm_stderr\": 0.020647590029679332\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7150837988826816,\n\
\ \"acc_stderr\": 0.015096222302469802,\n \"acc_norm\": 0.7150837988826816,\n\
\ \"acc_norm_stderr\": 0.015096222302469802\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8464052287581699,\n \"acc_stderr\": 0.02064559791041877,\n\
\ \"acc_norm\": 0.8464052287581699,\n \"acc_norm_stderr\": 0.02064559791041877\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n\
\ \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n\
\ \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.021613809395224812,\n\
\ \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.021613809395224812\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6170212765957447,\n \"acc_stderr\": 0.028999080904806185,\n \
\ \"acc_norm\": 0.6170212765957447,\n \"acc_norm_stderr\": 0.028999080904806185\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5827900912646675,\n\
\ \"acc_stderr\": 0.012593959992906427,\n \"acc_norm\": 0.5827900912646675,\n\
\ \"acc_norm_stderr\": 0.012593959992906427\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.023709788253811766,\n \
\ \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.023709788253811766\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8088235294117647,\n \"acc_stderr\": 0.015908290136278043,\n \
\ \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.015908290136278043\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8285714285714286,\n \"acc_stderr\": 0.02412746346265016,\n\
\ \"acc_norm\": 0.8285714285714286,\n \"acc_norm_stderr\": 0.02412746346265016\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9104477611940298,\n\
\ \"acc_stderr\": 0.02019067053502792,\n \"acc_norm\": 0.9104477611940298,\n\
\ \"acc_norm_stderr\": 0.02019067053502792\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646613,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646613\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.0261682213446623,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.0261682213446623\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40758873929008566,\n\
\ \"mc1_stderr\": 0.017201949234553107,\n \"mc2\": 0.5708422092704679,\n\
\ \"mc2_stderr\": 0.015184723749426742\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.011508957690722764\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5549658832448825,\n \
\ \"acc_stderr\": 0.0136890115674142\n }\n}\n```"
repo_url: https://huggingface.co/adamo1139/Yi-34B-200K-AEZAKMI-RAW-2301-LoRA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|arc:challenge|25_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|gsm8k|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hellaswag|10_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T16-09-42.767487.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T16-09-42.767487.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- '**/details_harness|winogrande|5_2024-01-27T16-09-42.767487.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-27T16-09-42.767487.parquet'
- config_name: results
data_files:
- split: 2024_01_27T16_09_42.767487
path:
- results_2024-01-27T16-09-42.767487.parquet
- split: latest
path:
- results_2024-01-27T16-09-42.767487.parquet
---
# Dataset Card for Evaluation run of adamo1139/Yi-34B-200K-AEZAKMI-RAW-2301-LoRA
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [adamo1139/Yi-34B-200K-AEZAKMI-RAW-2301-LoRA](https://huggingface.co/adamo1139/Yi-34B-200K-AEZAKMI-RAW-2301-LoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_adamo1139__Yi-34B-200K-AEZAKMI-RAW-2301-LoRA",
                    "harness_winogrande_5",
                    split="train")
```
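The timestamped splits follow a simple naming convention. As a rough sketch (inferred from the split names in the config above, not an official API), a split name is the run timestamp with `-` and `:` replaced by `_`:

```python
# Derive the split name used in this dataset's configs from a run timestamp.
# This mapping is an observed convention of the leaderboard exports, not a
# documented API guarantee.
run_timestamp = "2024-01-27T16:09:42.767487"
split_name = run_timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2024_01_27T16_09_42.767487
```

The `latest` split is an alias that always resolves to the parquet file of the most recent run.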
## Latest results
These are the [latest results from run 2024-01-27T16:09:42.767487](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__Yi-34B-200K-AEZAKMI-RAW-2301-LoRA/blob/main/results_2024-01-27T16-09-42.767487.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.7413605787370283,
"acc_stderr": 0.02895069135836259,
"acc_norm": 0.7476488274301629,
"acc_norm_stderr": 0.029481162291123596,
"mc1": 0.40758873929008566,
"mc1_stderr": 0.017201949234553107,
"mc2": 0.5708422092704679,
"mc2_stderr": 0.015184723749426742
},
"harness|arc:challenge|25": {
"acc": 0.6262798634812287,
"acc_stderr": 0.014137708601759091,
"acc_norm": 0.659556313993174,
"acc_norm_stderr": 0.01384746051889298
},
"harness|hellaswag|10": {
"acc": 0.6420035849432384,
"acc_stderr": 0.0047843129724954,
"acc_norm": 0.8388767177853017,
"acc_norm_stderr": 0.003668932629672556
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.868421052631579,
"acc_stderr": 0.027508689533549915,
"acc_norm": 0.868421052631579,
"acc_norm_stderr": 0.027508689533549915
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7962264150943397,
"acc_stderr": 0.024790784501775402,
"acc_norm": 0.7962264150943397,
"acc_norm_stderr": 0.024790784501775402
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02628055093284806,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02628055093284806
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.034355680560478746,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.034355680560478746
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.04971358884367406,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.04971358884367406
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7531914893617021,
"acc_stderr": 0.02818544130123409,
"acc_norm": 0.7531914893617021,
"acc_norm_stderr": 0.02818544130123409
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7310344827586207,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.7310344827586207,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0242785680243077,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0242785680243077
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8870967741935484,
"acc_stderr": 0.01800360332586361,
"acc_norm": 0.8870967741935484,
"acc_norm_stderr": 0.01800360332586361
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6502463054187192,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.6502463054187192,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865394,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9242424242424242,
"acc_stderr": 0.018852670234993093,
"acc_norm": 0.9242424242424242,
"acc_norm_stderr": 0.018852670234993093
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527029,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527029
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8051282051282052,
"acc_stderr": 0.020083167595181393,
"acc_norm": 0.8051282051282052,
"acc_norm_stderr": 0.020083167595181393
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476668,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8403361344537815,
"acc_stderr": 0.023793353997528802,
"acc_norm": 0.8403361344537815,
"acc_norm_stderr": 0.023793353997528802
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5033112582781457,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.5033112582781457,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9119266055045872,
"acc_stderr": 0.01215074371948165,
"acc_norm": 0.9119266055045872,
"acc_norm_stderr": 0.01215074371948165
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8970588235294118,
"acc_stderr": 0.02132833757080437,
"acc_norm": 0.8970588235294118,
"acc_norm_stderr": 0.02132833757080437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065498,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065498
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7713004484304933,
"acc_stderr": 0.028188240046929203,
"acc_norm": 0.7713004484304933,
"acc_norm_stderr": 0.028188240046929203
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.030884661089515375,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.030884661089515375
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622804,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622804
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.03145703854306251,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.03145703854306251
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8588957055214724,
"acc_stderr": 0.027351605518389752,
"acc_norm": 0.8588957055214724,
"acc_norm_stderr": 0.027351605518389752
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719098,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719098
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.0339329572976101,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.0339329572976101
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.01553751426325388,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.01553751426325388
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8876117496807152,
"acc_stderr": 0.011294541351216554,
"acc_norm": 0.8876117496807152,
"acc_norm_stderr": 0.011294541351216554
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8208092485549133,
"acc_stderr": 0.020647590029679332,
"acc_norm": 0.8208092485549133,
"acc_norm_stderr": 0.020647590029679332
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7150837988826816,
"acc_stderr": 0.015096222302469802,
"acc_norm": 0.7150837988826816,
"acc_norm_stderr": 0.015096222302469802
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8464052287581699,
"acc_stderr": 0.02064559791041877,
"acc_norm": 0.8464052287581699,
"acc_norm_stderr": 0.02064559791041877
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7845659163987139,
"acc_stderr": 0.023350225475471442,
"acc_norm": 0.7845659163987139,
"acc_norm_stderr": 0.023350225475471442
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.021613809395224812,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.021613809395224812
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6170212765957447,
"acc_stderr": 0.028999080904806185,
"acc_norm": 0.6170212765957447,
"acc_norm_stderr": 0.028999080904806185
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5827900912646675,
"acc_stderr": 0.012593959992906427,
"acc_norm": 0.5827900912646675,
"acc_norm_stderr": 0.012593959992906427
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8125,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.015908290136278043,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.015908290136278043
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8285714285714286,
"acc_stderr": 0.02412746346265016,
"acc_norm": 0.8285714285714286,
"acc_norm_stderr": 0.02412746346265016
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9104477611940298,
"acc_stderr": 0.02019067053502792,
"acc_norm": 0.9104477611940298,
"acc_norm_stderr": 0.02019067053502792
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646613,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646613
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.0261682213446623,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.0261682213446623
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40758873929008566,
"mc1_stderr": 0.017201949234553107,
"mc2": 0.5708422092704679,
"mc2_stderr": 0.015184723749426742
},
"harness|winogrande|5": {
"acc": 0.7868981846882399,
"acc_stderr": 0.011508957690722764
},
"harness|gsm8k|5": {
"acc": 0.5549658832448825,
"acc_stderr": 0.0136890115674142
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
bubl-ai/story_with_synthetic_test_set | ---
license: mit
---
|
freshpearYoon/val_free_6 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 8745352416
num_examples: 9105
download_size: 1361253032
dataset_size: 8745352416
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/k3_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of k3/K3/K3 (Girls' Frontline)
This is the dataset of k3/K3/K3 (Girls' Frontline), containing 24 images and their tags.
The core tags of this character are `breasts, hairband, large_breasts, grey_hair, long_hair, grey_eyes, headband, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 24 | 24.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k3_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 24 | 14.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k3_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 53 | 29.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k3_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 24 | 21.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k3_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 53 | 41.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k3_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/k3_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | 1girl, solo, open_mouth, cleavage, collarbone, blush, looking_at_viewer, navel, sports_bra, simple_background, smile, jacket, sweat, pants, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | open_mouth | cleavage | collarbone | blush | looking_at_viewer | navel | sports_bra | simple_background | smile | jacket | sweat | pants | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------------|:-----------|:-------------|:--------|:--------------------|:--------|:-------------|:--------------------|:--------|:---------|:--------|:--------|:-------------------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
result-kand2-sdxl-wuerst-karlo/86947388 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 168
num_examples: 10
download_size: 1325
dataset_size: 168
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "86947388"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
huggingface/autotrain-data-4v9x-3cwh-jsa8 | Invalid username or password. |
autoevaluate/autoeval-staging-eval-project-Blaise-g__SumPubmed-3c512f6e-12265640 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- Blaise-g/SumPubmed
eval_info:
task: summarization
model: Blaise-g/led_large_baseline_pubmed
metrics: ['bertscore']
dataset_name: Blaise-g/SumPubmed
dataset_config: Blaise-g--SumPubmed
dataset_split: test
col_mapping:
text: text
target: abstract
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: Blaise-g/led_large_baseline_pubmed
* Dataset: Blaise-g/SumPubmed
* Config: Blaise-g--SumPubmed
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Blaise-g](https://huggingface.co/Blaise-g) for evaluating this model. |
temiy/cat-simple | ---
license: apache-2.0
---
|
yuejunasia/haaudio | ---
license: other
---
This is a ha test dataset! |
bigbio/psytar |
---
language:
- en
bigbio_language:
- English
license: cc-by-4.0
multilinguality: monolingual
bigbio_license_shortname: CC_BY_4p0
pretty_name: PsyTAR
homepage: https://www.askapatient.com/research/pharmacovigilance/corpus-ades-psychiatric-medications.asp
bigbio_pubmed: False
bigbio_public: False
bigbio_tasks:
- NAMED_ENTITY_RECOGNITION
- TEXT_CLASSIFICATION
---
# Dataset Card for PsyTAR
## Dataset Description
- **Homepage:** https://www.askapatient.com/research/pharmacovigilance/corpus-ades-psychiatric-medications.asp
- **Pubmed:** False
- **Public:** False
- **Tasks:** NER,TXTCLASS
The "Psychiatric Treatment Adverse Reactions" (PsyTAR) dataset contains 891 drug
reviews posted by patients on "askapatient.com" about the effectiveness of, and adverse
drug events associated with, Zoloft, Lexapro, Cymbalta, and Effexor XR.
This dataset can be used for (multi-label) sentence classification of Adverse Drug
Reactions (ADR), Withdrawal Symptoms (WDs), Sign/Symptoms/Illness (SSIs), Drug
Indications (DIs), Drug Effectiveness (EF), Drug Ineffectiveness (INF), and Others, as well
as for recognition of named entities in the categories ADRs, WDs, SSIs, and DIs.
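To make the multi-label sentence classification setup concrete, here is a minimal sketch (the label list mirrors the classes above; the example sentences and their labels are invented for illustration, not drawn from PsyTAR):

```python
# Class labels for multi-label sentence classification, as listed above.
LABELS = ["ADR", "WD", "SSI", "DI", "EF", "INF", "Other"]

def to_multi_hot(labels):
    """Encode a set of label names as a multi-hot vector over LABELS."""
    return [1 if name in labels else 0 for name in LABELS]

# Invented example sentences in the style of patient drug reviews;
# a single sentence may carry several labels at once.
examples = [
    ("It helped my anxiety but I gained a lot of weight.", {"EF", "ADR"}),
    ("Stopping it cold turkey gave me brain zaps.", {"WD"}),
]

for text, labels in examples:
    print(to_multi_hot(labels), text)
```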
## Citation Information
```
@article{Zolnoori2019,
author = {Maryam Zolnoori and
Kin Wah Fung and
Timothy B. Patrick and
Paul Fontelo and
Hadi Kharrazi and
Anthony Faiola and
Yi Shuan Shirley Wu and
Christina E. Eldredge and
Jake Luo and
Mike Conway and
Jiaxi Zhu and
Soo Kyung Park and
Kelly Xu and
Hamideh Moayyed and
Somaieh Goudarzvand},
title = {A systematic approach for developing a corpus of patient reported adverse drug events: A case study for {SSRI} and {SNRI} medications},
journal = {Journal of Biomedical Informatics},
volume = {90},
year = {2019},
url = {https://doi.org/10.1016/j.jbi.2018.12.005},
doi = {10.1016/j.jbi.2018.12.005},
}
```
|
DeepFoldProtein/Foldseek_over70_proteome_UniDoc_test | ---
dataset_info:
features:
- name: uniprotAccession
dtype: string
- name: domain
sequence:
sequence: int64
- name: ndom
dtype: int64
- name: taxId
dtype: string
- name: uniprotSequence
dtype: string
splits:
- name: train
num_bytes: 4002
num_examples: 12
download_size: 7272
dataset_size: 4002
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
alvarochelo/dataset_nautical | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 908645884.0
num_examples: 239
download_size: 875628182
dataset_size: 908645884.0
---
# Dataset Card for "dataset_nautical"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/find_first_sent_train_50_eval_10_recite | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 170837
num_examples: 110
- name: validation
num_bytes: 15661
num_examples: 10
download_size: 0
dataset_size: 186498
---
# Dataset Card for "find_first_sent_train_50_eval_10_recite"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nlp-brin-id/pos_pairs | ---
license: apache-2.0
---
|
wwydmanski/metagenomic_curated | ---
license: artistic-2.0
---
# Metagenomic curated data
This is a Python repack of the curated data from the [Metagenomic Data Repository](https://waldronlab.io/curatedMetagenomicData/).
Please refer to the [study list](https://experimenthub.bioconductor.org/package/curatedMetagenomicData) and [study metadata](https://waldronlab.io/curatedMetagenomicData/articles/available-studies.html) for the list of available datasets.
## Sample usage
```python
import datasets
import numpy as np

ds = datasets.load_dataset("wwydmanski/metagenomic_curated", "EH1726")
X = np.array([list(i.values()) for i in ds['train']['features']])
y = np.array([x['study_condition'] for x in ds['train']['metadata']])
```
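The `"EH1726"` argument above is an ExperimentHub ID (EHID), looked up from a study name on the ExperimentHub title pages. As a tiny illustrative helper (`experimenthub_url` is our own name, not part of this repository), that lookup URL can be built programmatically:

```python
def experimenthub_url(study_name: str) -> str:
    """Build the ExperimentHub page URL listing the EHIDs for a study."""
    return f"https://experimenthub.bioconductor.org/title/{study_name}"

# For example, for the ThomasAM_2018a study:
print(experimenthub_url("ThomasAM_2018a"))
```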
## Finding a relevant dataset EHID
The easiest way to find an interesting study is via the [study metadata](https://waldronlab.io/curatedMetagenomicData/articles/available-studies.html). After that, you can find the corresponding EHIDs by referring to the https://experimenthub.bioconductor.org/title/{study_name} page.
Suppose the ThomasAM_2018a study piqued your curiosity: you can then find all relevant datasets on the [https://experimenthub.bioconductor.org/title/ThomasAM_2018a](https://experimenthub.bioconductor.org/title/ThomasAM_2018a) page. |
AdapterOcean/gptindex-standardized_cluster_0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 10931345
num_examples: 1234
download_size: 2937820
dataset_size: 10931345
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gptindex-standardized_cluster_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lipi17/building-cracks | ---
task_categories:
- object-detection
tags:
- roboflow
- roboflow2huggingface
---
<div align="center">
<img width="640" alt="lipi17/building-cracks" src="https://huggingface.co/datasets/lipi17/building-cracks/resolve/main/thumbnail.jpg">
</div>
### Dataset Labels
```
['crack']
```
### Number of Images
```json
{'valid': 433, 'test': 211, 'train': 1490}
```
### How to Use
- Install [datasets](https://pypi.org/project/datasets/):
```bash
pip install datasets
```
- Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("lipi17/building-cracks", name="full")
example = ds['train'][0]
```
### Roboflow Dataset Page
[https://universe.roboflow.com/antonio-raimundo/crack-detection-y5kyg/dataset/2](https://universe.roboflow.com/antonio-raimundo/crack-detection-y5kyg/dataset/2?ref=roboflow2huggingface)
### Citation
```
@misc{ crack-detection-y5kyg_dataset,
title = { Crack Detection Dataset },
type = { Open Source Dataset },
author = { António Raimundo },
howpublished = { \\url{ https://universe.roboflow.com/antonio-raimundo/crack-detection-y5kyg } },
url = { https://universe.roboflow.com/antonio-raimundo/crack-detection-y5kyg },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2023 },
month = { feb },
note = { visited on 2023-10-21 },
}
```
### License
CC BY 4.0
### Dataset Summary
This dataset was exported via roboflow.com on February 10, 2023 at 3:51 PM GMT
Roboflow is an end-to-end computer vision platform that helps you
* collaborate with your team on computer vision projects
* collect & organize images
* understand and search unstructured image data
* annotate, and create datasets
* export, train, and deploy computer vision models
* use active learning to improve your dataset over time
For state of the art Computer Vision training notebooks you can use with this dataset,
visit https://github.com/roboflow/notebooks
To find over 100k other datasets and pre-trained models, visit https://universe.roboflow.com
The dataset includes 2134 images.
Cracks are annotated in COCO format.
The following pre-processing was applied to each image:
* Auto-orientation of pixel data (with EXIF-orientation stripping)
* Resize to 640x640 (Stretch)
No image augmentation techniques were applied.
|
vmathur87/llm-support | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 717288.0
num_examples: 242
- name: test
num_bytes: 59280.0
num_examples: 20
download_size: 338197
dataset_size: 776568.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Shularp/Process_tested-Shularp-Process_tested-facebook-floresarb_Arab_to_eng_Latn | ---
dataset_info:
features:
- name: translation
struct:
- name: ar
dtype: string
- name: en
dtype: string
- name: id
sequence: int64
splits:
- name: train
num_bytes: 361758
num_examples: 997
- name: test
num_bytes: 379791
num_examples: 1012
download_size: 412821
dataset_size: 741549
---
# Dataset Card for "Process_tested-Shularp-Process_tested-facebook-floresarb_Arab_to_eng_Latn"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
atiranela/rodrigo | ---
license: openrail
---
|
chishiki-ai/cnn-course | ---
license: cc-by-4.0
---
|
Lhaippp/DMHomo | ---
license: mit
---
|
open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Base-test-WVG | ---
pretty_name: Evaluation run of LTC-AI-Labs/L2-7b-Base-test-WVG
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LTC-AI-Labs/L2-7b-Base-test-WVG](https://huggingface.co/LTC-AI-Labs/L2-7b-Base-test-WVG)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Base-test-WVG\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T07:52:27.189086](https://huggingface.co/datasets/open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Base-test-WVG/blob/main/results_2023-10-28T07-52-27.189086.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.003145973154362416,\n\
\ \"em_stderr\": 0.0005734993648436373,\n \"f1\": 0.07481229026845672,\n\
\ \"f1_stderr\": 0.0016422896702234556,\n \"acc\": 0.40267285313968093,\n\
\ \"acc_stderr\": 0.00970555723399882\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.003145973154362416,\n \"em_stderr\": 0.0005734993648436373,\n\
\ \"f1\": 0.07481229026845672,\n \"f1_stderr\": 0.0016422896702234556\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06974981046247157,\n \
\ \"acc_stderr\": 0.007016389571013843\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7355958958168903,\n \"acc_stderr\": 0.012394724896983799\n\
\ }\n}\n```"
repo_url: https://huggingface.co/LTC-AI-Labs/L2-7b-Base-test-WVG
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T06_36_16.797528
path:
- '**/details_harness|drop|3_2023-10-28T06-36-16.797528.parquet'
- split: 2023_10_28T07_52_27.189086
path:
- '**/details_harness|drop|3_2023-10-28T07-52-27.189086.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T07-52-27.189086.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T06_36_16.797528
path:
- '**/details_harness|gsm8k|5_2023-10-28T06-36-16.797528.parquet'
- split: 2023_10_28T07_52_27.189086
path:
- '**/details_harness|gsm8k|5_2023-10-28T07-52-27.189086.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T07-52-27.189086.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-19-09.186622.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-19-09.186622.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-19-09.186622.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T06_36_16.797528
path:
- '**/details_harness|winogrande|5_2023-10-28T06-36-16.797528.parquet'
- split: 2023_10_28T07_52_27.189086
path:
- '**/details_harness|winogrande|5_2023-10-28T07-52-27.189086.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T07-52-27.189086.parquet'
- config_name: results
data_files:
- split: 2023_10_03T17_19_09.186622
path:
- results_2023-10-03T17-19-09.186622.parquet
- split: 2023_10_28T06_36_16.797528
path:
- results_2023-10-28T06-36-16.797528.parquet
- split: 2023_10_28T07_52_27.189086
path:
- results_2023-10-28T07-52-27.189086.parquet
- split: latest
path:
- results_2023-10-28T07-52-27.189086.parquet
---
# Dataset Card for Evaluation run of LTC-AI-Labs/L2-7b-Base-test-WVG
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/LTC-AI-Labs/L2-7b-Base-test-WVG
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [LTC-AI-Labs/L2-7b-Base-test-WVG](https://huggingface.co/LTC-AI-Labs/L2-7b-Base-test-WVG) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Base-test-WVG",
"harness_winogrande_5",
split="train")
```
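The split names follow a simple convention derived from the run timestamp: the `-` in the date and the `:` in the time are replaced with `_` (the `.parquet` filenames, by contrast, keep `-` throughout). A minimal sketch of that mapping, assuming the naming pattern visible in the configurations above holds for all runs:

```python
def timestamp_to_split_name(ts: str) -> str:
    """Map a run timestamp such as '2023-10-28T07:52:27.189086' to the
    split name used in each configuration, e.g. '2023_10_28T07_52_27.189086'."""
    date_part, time_part = ts.split("T")
    return date_part.replace("-", "_") + "T" + time_part.replace(":", "_")

# For example, the second winogrande run listed in the configurations:
print(timestamp_to_split_name("2023-10-28T07:52:27.189086"))
# -> 2023_10_28T07_52_27.189086
```

The returned name can then be passed as the `split` argument to `load_dataset` to select that specific run instead of the `latest` split.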
## Latest results
These are the [latest results from run 2023-10-28T07:52:27.189086](https://huggingface.co/datasets/open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Base-test-WVG/blob/main/results_2023-10-28T07-52-27.189086.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.003145973154362416,
"em_stderr": 0.0005734993648436373,
"f1": 0.07481229026845672,
"f1_stderr": 0.0016422896702234556,
"acc": 0.40267285313968093,
"acc_stderr": 0.00970555723399882
},
"harness|drop|3": {
"em": 0.003145973154362416,
"em_stderr": 0.0005734993648436373,
"f1": 0.07481229026845672,
"f1_stderr": 0.0016422896702234556
},
"harness|gsm8k|5": {
"acc": 0.06974981046247157,
"acc_stderr": 0.007016389571013843
},
"harness|winogrande|5": {
"acc": 0.7355958958168903,
"acc_stderr": 0.012394724896983799
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kpriyanshu256/MultiTabQA-multitable_pretraining-valid-v2 | ---
dataset_info:
features:
- name: query
dtype: string
- name: table_names
sequence: string
- name: tables
sequence: string
- name: answer
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
splits:
- name: train
num_bytes: 853236383
num_examples: 536
download_size: 207209865
dataset_size: 853236383
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
w95/fin | ---
configs:
- config_name: default
data_files:
- split: train
path: train.jsonl
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
---
|
Deojoandco/capstone_fromgpt_without_gold | ---
dataset_info:
features:
- name: dialog_id
dtype: int64
- name: dialogue
dtype: string
- name: summary
dtype: string
- name: gold_tags
dtype: string
- name: query
dtype: string
- name: gpt_success
dtype: bool
- name: gpt_response
dtype: string
- name: gold_tags_tokens_count
dtype: int64
- name: GPT_OUTPUT_FOUND
dtype: bool
- name: gpt_output_tags
dtype: string
- name: gpt_output_tag_tokens
dtype: int64
- name: summary_gpt_token_count_match
dtype: bool
- name: gpt_output_token_count
dtype: int64
- name: gpt_output_tag_count
dtype: int64
- name: summary_gpt_tags_token_count_match
dtype: bool
splits:
- name: test
num_bytes: 57588
num_examples: 12
download_size: 30674
dataset_size: 57588
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "capstone_fromgpt_without_gold"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Abhinav-B/finetune_llama_wikisql | ---
dataset_info:
features:
- name: formatted_text
dtype: string
splits:
- name: train
num_bytes: 1530908
num_examples: 10000
download_size: 703398
dataset_size: 1530908
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dexhrestha/annomi-sample | ---
dataset_info:
features:
- name: text
dtype: string
- name: len
dtype: int64
splits:
- name: train
num_bytes: 734843.2554793004
num_examples: 7461
- name: validation
num_bytes: 81747.74186406907
num_examples: 830
download_size: 611225
dataset_size: 816590.9973433695
---
# Dataset Card for "annomi-sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sjonas50/test | ---
license: creativeml-openrail-m
---
|
Ironov/abstract | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1245046.0
num_examples: 409
download_size: 1254490
dataset_size: 1245046.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Rowan/hellaswag | ---
language:
- en
paperswithcode_id: hellaswag
pretty_name: HellaSwag
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 43232624
num_examples: 39905
- name: test
num_bytes: 10791853
num_examples: 10003
- name: validation
num_bytes: 11175717
num_examples: 10042
download_size: 71494896
dataset_size: 65200194
---
# Dataset Card for "hellaswag"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://rowanzellers.com/hellaswag/](https://rowanzellers.com/hellaswag/)
- **Repository:** [https://github.com/rowanz/hellaswag/](https://github.com/rowanz/hellaswag/)
- **Paper:** [HellaSwag: Can a Machine Really Finish Your Sentence?](https://arxiv.org/abs/1905.07830)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 71.49 MB
- **Size of the generated dataset:** 65.32 MB
- **Total amount of disk used:** 136.81 MB
### Dataset Summary
HellaSwag is a dataset for commonsense natural language inference (NLI). The accompanying paper, "HellaSwag: Can a Machine Really Finish Your Sentence?", was published at ACL 2019.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 71.49 MB
- **Size of the generated dataset:** 65.32 MB
- **Total amount of disk used:** 136.81 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"activity_label": "Removing ice from car",
"ctx": "Then, the man writes over the snow covering the window of a car, and a woman wearing winter clothes smiles. then",
"ctx_a": "Then, the man writes over the snow covering the window of a car, and a woman wearing winter clothes smiles.",
"ctx_b": "then",
"endings": "[\", the man adds wax to the windshield and cuts it.\", \", a person board a ski lift, while two men supporting the head of the per...",
"ind": 4,
"label": "3",
"source_id": "activitynet~v_-1IBHYS3L-Y",
"split": "train",
"split_type": "indomain"
}
```
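In this sample, `ctx` is simply `ctx_a` followed by a space and `ctx_b` — an observation about this particular instance rather than a documented guarantee for every row:

```python
ctx_a = ("Then, the man writes over the snow covering the window of a car, "
         "and a woman wearing winter clothes smiles.")
ctx_b = "then"
ctx = ("Then, the man writes over the snow covering the window of a car, "
       "and a woman wearing winter clothes smiles. then")

# The full context is the two context pieces joined with a single space.
assert ctx == ctx_a + " " + ctx_b
```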
### Data Fields
The data fields are the same among all splits.
#### default
- `ind`: a `int32` feature.
- `activity_label`: a `string` feature.
- `ctx_a`: a `string` feature.
- `ctx_b`: a `string` feature.
- `ctx`: a `string` feature.
- `endings`: a `list` of `string` features.
- `source_id`: a `string` feature.
- `split`: a `string` feature.
- `split_type`: a `string` feature.
- `label`: a `string` feature.
### Data Splits
| name |train|validation|test |
|-------|----:|---------:|----:|
|default|39905| 10042|10003|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
MIT https://github.com/rowanz/hellaswag/blob/master/LICENSE
### Citation Information
```
@inproceedings{zellers2019hellaswag,
title={HellaSwag: Can a Machine Really Finish Your Sentence?},
author={Zellers, Rowan and Holtzman, Ari and Bisk, Yonatan and Farhadi, Ali and Choi, Yejin},
booktitle ={Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics},
year={2019}
}
```
### Contributions
Thanks to [@albertvillanova](https://github.com/albertvillanova), [@mariamabarham](https://github.com/mariamabarham), [@thomwolf](https://github.com/thomwolf), [@patrickvonplaten](https://github.com/patrickvonplaten), [@lewtun](https://github.com/lewtun) for adding this dataset. |
ag2428/reasoningDataV4 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: answer
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 2481669221
num_examples: 2062854
download_size: 1500063761
dataset_size: 2481669221
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "reasoningDataV4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
paoloitaliani/news_articles | ---
dataset_info:
- config_name: corriere_autunno
features:
- name: author
dtype: string
- name: journal
dtype: string
- name: body
dtype: string
- name: date
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 339578
num_examples: 90
download_size: 237083
dataset_size: 339578
- config_name: corriere_primavera
features:
- name: author
dtype: string
- name: journal
dtype: string
- name: body
dtype: string
- name: date
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 319422
num_examples: 105
download_size: 206264
dataset_size: 319422
- config_name: fattoq_autunno
features:
- name: author
dtype: string
- name: journal
dtype: string
- name: body
dtype: string
- name: date
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 519012
num_examples: 133
download_size: 338948
dataset_size: 519012
- config_name: fattoq_primavera
features:
- name: author
dtype: string
- name: journal
dtype: string
- name: body
dtype: string
- name: date
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 508621
num_examples: 152
download_size: 331977
dataset_size: 508621
- config_name: ukraine
features:
- name: date
dtype: timestamp[ns]
- name: body
dtype: string
- name: author
dtype: string
- name: journal
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 81923456
num_examples: 27449
download_size: 0
dataset_size: 81923456
configs:
- config_name: corriere_autunno
data_files:
- split: train
path: corriere_autunno/train-*
- config_name: corriere_primavera
data_files:
- split: train
path: corriere_primavera/train-*
- config_name: fattoq_autunno
data_files:
- split: train
path: fattoq_autunno/train-*
- config_name: fattoq_primavera
data_files:
- split: train
path: fattoq_primavera/train-*
- config_name: ukraine
data_files:
- split: train
path: ukraine/train-*
---
# Dataset Card for "news_articles"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nirantk/french-books | ---
dataset_info:
features:
- name: file_id
dtype: string
- name: ocr
dtype: string
- name: title
dtype: string
- name: date
dtype: string
- name: author
dtype: string
- name: page_count
dtype: int64
- name: word_count
dtype: int64
- name: character_count
dtype: int64
- name: complete_text
dtype: string
splits:
- name: train
num_bytes: 427823810
num_examples: 1101
download_size: 246984465
dataset_size: 427823810
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nlpai-lab/kullm-v2 | ---
license: apache-2.0
task_categories:
- text-generation
language:
- ko
pretty_name: kullm
size_categories:
- 10K<n<100K
---
# Dataset Card for "KULLM-v2"
## Dataset Summary
Korean translation of GPT4ALL, Dolly, and Vicuna data.
repository: [nlpai-lab/KULLM](https://github.com/nlpai-lab/KULLM)
huggingface: [nlpai-lab/kullm-v2](https://huggingface.co/nlpai-lab/kullm-polyglot-12.8b-v2)
#### Dataset translation
The 'instruction', 'input', and 'output' fields in the dataset were translated via the DeepL API.
## License
Apache-2.0
```python
>>> from datasets import load_dataset
>>> ds = load_dataset("nlpai-lab/kullm-v2", split="train")
>>> ds
DatasetDict({
train: Dataset({
features: ['id', 'instruction', 'input', 'output'],
num_rows: 152630
})
})
```
```python
>>> ds[0]
{'id': 'alpaca_{idx}',
'instruction': '3원색이란 무엇인가요?',
'input': '',
'output': '세 가지 기본 색은 빨강, 파랑, 노랑입니다. 이 색은 다른 색을 혼합하여 만들 수 없고 다른 모든 색은 다양한 비율로 조합하여 만들 수 있기 때문에 원색이라고 부릅니다. 빛에 사용되는 첨가제 색상 시스템에서 원색은 빨강, 녹색, 파랑(RGB)입니다.'}
``` |
open-llm-leaderboard/details_UCLA-AGI__test_final | ---
pretty_name: Evaluation run of UCLA-AGI/test_final
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [UCLA-AGI/test_final](https://huggingface.co/UCLA-AGI/test_final) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_UCLA-AGI__test_final\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-13T15:52:58.260309](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__test_final/blob/main/results_2024-01-13T15-52-58.260309.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6144035496773548,\n\
\ \"acc_stderr\": 0.032858739117399755,\n \"acc_norm\": 0.6200519616024565,\n\
\ \"acc_norm_stderr\": 0.03352475225298005,\n \"mc1\": 0.4222766217870257,\n\
\ \"mc1_stderr\": 0.017290733254248174,\n \"mc2\": 0.5789464689775264,\n\
\ \"mc2_stderr\": 0.015807009741465705\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6305460750853242,\n \"acc_stderr\": 0.014104578366491887,\n\
\ \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.01383056892797433\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.676458872734515,\n\
\ \"acc_stderr\": 0.0046687106891924,\n \"acc_norm\": 0.8584943238398726,\n\
\ \"acc_norm_stderr\": 0.0034783009945146973\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432115,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432115\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n\
\ \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n\
\ \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n\
\ \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n \"\
acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7322580645161291,\n \"acc_stderr\": 0.02518900666021238,\n \"\
acc_norm\": 0.7322580645161291,\n \"acc_norm_stderr\": 0.02518900666021238\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709447,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709447\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647074,\n\
\ \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647074\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7944954128440367,\n \"acc_stderr\": 0.01732435232501601,\n \"\
acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.01732435232501601\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808514,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808514\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489284,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489284\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917212,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917212\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3664804469273743,\n\
\ \"acc_stderr\": 0.016115235504865464,\n \"acc_norm\": 0.3664804469273743,\n\
\ \"acc_norm_stderr\": 0.016115235504865464\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n\
\ \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n\
\ \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.02651759772446501,\n\
\ \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.02651759772446501\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303055,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303055\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44589308996088656,\n\
\ \"acc_stderr\": 0.012695244711379778,\n \"acc_norm\": 0.44589308996088656,\n\
\ \"acc_norm_stderr\": 0.012695244711379778\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.619281045751634,\n \"acc_stderr\": 0.0196438015579248,\n \
\ \"acc_norm\": 0.619281045751634,\n \"acc_norm_stderr\": 0.0196438015579248\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n\
\ \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4222766217870257,\n\
\ \"mc1_stderr\": 0.017290733254248174,\n \"mc2\": 0.5789464689775264,\n\
\ \"mc2_stderr\": 0.015807009741465705\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183525\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3419257012888552,\n \
\ \"acc_stderr\": 0.0130660896251828\n }\n}\n```"
repo_url: https://huggingface.co/UCLA-AGI/test_final
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|arc:challenge|25_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|gsm8k|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hellaswag|10_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T15-52-58.260309.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T15-52-58.260309.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- '**/details_harness|winogrande|5_2024-01-13T15-52-58.260309.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-13T15-52-58.260309.parquet'
- config_name: results
data_files:
- split: 2024_01_13T15_52_58.260309
path:
- results_2024-01-13T15-52-58.260309.parquet
- split: latest
path:
- results_2024-01-13T15-52-58.260309.parquet
---
# Dataset Card for Evaluation run of UCLA-AGI/test_final
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [UCLA-AGI/test_final](https://huggingface.co/UCLA-AGI/test_final) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_UCLA-AGI__test_final",
"harness_winogrande_5",
split="train")
```
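Split names other than `latest` encode the run timestamp (e.g. `2024_01_13T15_52_58.260309`). If a repo accumulates several runs, a small standard-library helper can sort them chronologically — a sketch (the helper name is illustrative):

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    # Split names use underscores where ISO timestamps use "-" and ":".
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

# With several timestamped splits, the most recent run is simply the max.
runs = ["2024_01_13T15_52_58.260309"]
latest_run = max(runs, key=parse_split_timestamp)
```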
## Latest results
These are the [latest results from run 2024-01-13T15:52:58.260309](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__test_final/blob/main/results_2024-01-13T15-52-58.260309.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6144035496773548,
"acc_stderr": 0.032858739117399755,
"acc_norm": 0.6200519616024565,
"acc_norm_stderr": 0.03352475225298005,
"mc1": 0.4222766217870257,
"mc1_stderr": 0.017290733254248174,
"mc2": 0.5789464689775264,
"mc2_stderr": 0.015807009741465705
},
"harness|arc:challenge|25": {
"acc": 0.6305460750853242,
"acc_stderr": 0.014104578366491887,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.01383056892797433
},
"harness|hellaswag|10": {
"acc": 0.676458872734515,
"acc_stderr": 0.0046687106891924,
"acc_norm": 0.8584943238398726,
"acc_norm_stderr": 0.0034783009945146973
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.02518900666021238,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.02518900666021238
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709447,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709447
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647074,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647074
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.01732435232501601,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.01732435232501601
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808514,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808514
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489284,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489284
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917212,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917212
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.016115235504865464,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.016115235504865464
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.02651759772446501,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.02651759772446501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303055,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303055
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44589308996088656,
"acc_stderr": 0.012695244711379778,
"acc_norm": 0.44589308996088656,
"acc_norm_stderr": 0.012695244711379778
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.619281045751634,
"acc_stderr": 0.0196438015579248,
"acc_norm": 0.619281045751634,
"acc_norm_stderr": 0.0196438015579248
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786845,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786845
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4222766217870257,
"mc1_stderr": 0.017290733254248174,
"mc2": 0.5789464689775264,
"mc2_stderr": 0.015807009741465705
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183525
},
"harness|gsm8k|5": {
"acc": 0.3419257012888552,
"acc_stderr": 0.0130660896251828
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
DarthReca/quakeset | ---
license: openrail
task_categories:
- image-classification
- image-segmentation
tags:
- climate
pretty_name: QuakeSet
size_categories:
- 1K<n<10K
---
# Dataset Card for QuakeSet
QuakeSet is a dataset for analyzing different attributes of earthquakes. It contains bi-temporal time series of images and ground-truth annotations for magnitudes, hypocenters, and affected areas.
- **PrePrint:** https://arxiv.org/abs/2403.18116
## Dataset Details
### Dataset Description
The images are taken from the Sentinel-1 mission using the Interferometric Wide swath mode.
The International Seismological Centre provides information about earthquakes.
- **License:** OPENRAIL
## Dataset Structure
The dataset is divided into three folds with an equal distribution of magnitudes, balanced between positive and negative examples.
Each sample contains:
- an image
- x, y coordinates
- the EPSG code of the coordinates
- an affected label
- the magnitude
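The fold balancing described above can be sketched as stratified assignment over magnitude bins — a minimal illustration, not the authors' exact procedure (the `magnitude` field name matches the sample description; the binning is an assumption):

```python
from collections import defaultdict

def stratified_folds(samples, n_folds=3):
    """Round-robin samples into folds within magnitude bins,
    so each fold sees a similar magnitude distribution."""
    bins = defaultdict(list)
    for s in samples:
        bins[round(s["magnitude"])].append(s)
    folds = [[] for _ in range(n_folds)]
    for group in bins.values():
        for i, s in enumerate(group):
            folds[i % n_folds].append(s)
    return folds

samples = [{"magnitude": m} for m in (4.2, 4.8, 5.1, 5.9, 6.3, 6.7)]
folds = stratified_folds(samples)
```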
## Citation
**BibTeX:**
```
@article{quakeset,
title={QuakeSet: A Dataset and Low-Resource Models to Monitor Earthquakes through Sentinel-1},
author={Daniele Rege Cambrin and Paolo Garza},
  journal={Proceedings of the 21st International Conference on Information Systems for Crisis Response and Management},
year={2024},
}
``` |
hazyresearch/fda | ---
dataset_info:
features:
- name: doc_id
dtype: string
- name: file_name
dtype: string
- name: key
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: validaion
num_bytes: 8498008
num_examples: 1102
download_size: 1381388
dataset_size: 8498008
configs:
- config_name: default
data_files:
- split: validaion
path: data/validaion-*
---
|
Fizzarolli/bluemoon_processeed | ---
task_categories:
- text-generation
source_datasets: PJMixers/grimulkan_bluemoon_Karen_cleaned-carded
---
A preprocessed version of [PJMixers/grimulkan_bluemoon_Karen_cleaned-carded](https://huggingface.co/datasets/PJMixers/grimulkan_bluemoon_Karen_cleaned-carded) in a fun little prompt format for fine-tuning.
xwjzds/pretrain_punctuation | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2388642394
num_examples: 631331
download_size: 1485646531
dataset_size: 2388642394
---
# Dataset Card for "pretrain_punctuation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aisuko/funsd-layoutlmv3 | ---
license: apache-2.0
---
Original from: https://huggingface.co/datasets/aisuko/funsd-layoutlmv3
Adapted by: Aisuko
License: Apache-2.0
```python
from datasets import load_dataset

dataset = load_dataset("aisuko/funsd-layoutlmv3")
# check the dataset splits
dataset
# check the features
dataset["train"].features
# check the first example
example = dataset["train"][0]
example["image"]
``` |
projectbaraat/hindi-translation-data-v0.1 | ---
dataset_info:
features:
- name: src
dtype: string
- name: tgt
dtype: string
splits:
- name: train
num_bytes: 3880301105
num_examples: 10572313
download_size: 2054879767
dataset_size: 3880301105
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distilled-from-one-sec-cv12/chunk_16 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1245592852
num_examples: 242711
download_size: 1269320601
dataset_size: 1245592852
---
# Dataset Card for "chunk_16"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rb05751/reuters_articles | ---
license: cc
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: title
dtype: string
- name: body
dtype: string
splits:
- name: train
num_bytes: 12503434
num_examples: 15000
- name: validation
num_bytes: 4272675
num_examples: 5000
- name: test
num_bytes: 1709070
num_examples: 2000
download_size: 10790292
dataset_size: 18485179
---
|
Twenty1/aws-lambda-developer-guide-docs | ---
license: openrail
---
|
Sitedemerde/projet | ---
license: other
license_name: jesaispas
license_link: LICENSE
---
|
Ehraim/SequentialLearnerv3 | ---
license: apache-2.0
---
|
cheafdevo56/Influential_CitedNegs_1_Percent | ---
dataset_info:
features:
- name: query
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: title
dtype: string
- name: pos
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: title
dtype: string
- name: neg
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: score
dtype: int64
- name: title
dtype: string
splits:
- name: train
num_bytes: 173163283.2
num_examples: 45000
- name: validation
num_bytes: 19240364.8
num_examples: 5000
download_size: 115634444
dataset_size: 192403648.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
shibing624/huatuo_medical_qa_sharegpt | ---
license: apache-2.0
---
source:
- https://huggingface.co/datasets/FreedomIntelligence/HuatuoGPT-sft-data-v1
- https://huggingface.co/datasets/FreedomIntelligence/HuatuoGPT2_sft_instruct_GPT4_50K
Converted to ShareGPT format, saved as JSONL files.
data size:
```
> wc -l HuatuoGPT_sft_data_v1_sharegpt.jsonl
226042 HuatuoGPT_sft_data_v1_sharegpt.jsonl
> wc -l HuatuoGPT2_sft_instruct_GPT4_sharegpt.jsonl
50000 HuatuoGPT2_sft_instruct_GPT4_sharegpt.jsonl
```
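For reference, each line of the converted JSONL holds one ShareGPT-style record: a `conversations` list alternating human and gpt turns. A minimal sketch (the medical Q/A text here is invented purely for illustration):

```python
import json

# Hypothetical example record sketching the ShareGPT-style schema this
# repo converts to; the Q/A text is invented for illustration only.
record = {
    "conversations": [
        {"from": "human", "value": "What are common symptoms of a cold?"},
        {"from": "gpt", "value": "Typical symptoms include a runny nose, sore throat, and cough."},
    ]
}

# One record per line in the JSONL output; non-ASCII text is kept as-is
line = json.dumps(record, ensure_ascii=False)
roundtrip = json.loads(line)
print(roundtrip["conversations"][0]["from"])  # -> human
```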
Conversion code: convert.py
```python
import json

input_file = './HuatuoGPT2_sft_instruct_GPT4.jsonl'
output_file = './HuatuoGPT2_sft_instruct_GPT4_sharegpt.jsonl'

with open(input_file, 'r', encoding='utf-8') as infile, open(output_file, 'w', encoding='utf-8') as outfile:
    # Read the JSONL file line by line
    for id, line in enumerate(infile):
        output_json = {"conversations": []}
        # Parse the JSON object
        data = json.loads(line.strip())
        # Each JSON object has a "data" list containing questions and answers
        for i, item in enumerate(data['data']):
            if i % 2 == 0:  # questions sit at even positions, answers at odd positions
                output_json['conversations'].append({
                    "from": "human",
                    "value": item[2:]
                })
            else:
                output_json['conversations'].append({
                    "from": "gpt",
                    "value": item[2:]
                })
        # Write the converted JSON to the output file
        a = json.dumps(output_json, ensure_ascii=False)
        outfile.write(a + '\n')
print(f"Conversion complete. Output saved to '{output_file}'.")
``` |
AlexanderBenady/generated_lectures | ---
license: unknown
---
|
ArjjunS/Pantaloons_public | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 4229185.0
num_examples: 75
download_size: 4206323
dataset_size: 4229185.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_mnli_his_he | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 125800
num_examples: 584
- name: dev_mismatched
num_bytes: 126834
num_examples: 518
- name: test_matched
num_bytes: 141846
num_examples: 661
- name: test_mismatched
num_bytes: 121135
num_examples: 491
- name: train
num_bytes: 5412418
num_examples: 25033
download_size: 3767461
dataset_size: 5928033
---
# Dataset Card for "MULTI_VALUE_mnli_his_he"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
up-the-brics/raw-chatgpt | ---
license: apache-2.0
---
|
afrikaans_ner_corpus | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- af
license:
- other
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: Afrikaans Ner Corpus
license_details: Creative Commons Attribution 2.5 South Africa License
dataset_info:
config_name: afrikaans_ner_corpus
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': OUT
'1': B-PERS
'2': I-PERS
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-MISC
'8': I-MISC
splits:
- name: train
num_bytes: 4025651
num_examples: 8962
download_size: 944804
dataset_size: 4025651
configs:
- config_name: afrikaans_ner_corpus
data_files:
- split: train
path: afrikaans_ner_corpus/train-*
default: true
---
# Dataset Card for Afrikaans Ner Corpus
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Afrikaans Ner Corpus Homepage](https://repo.sadilar.org/handle/20.500.12185/299)
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** [Martin Puttkammer](mailto:Martin.Puttkammer@nwu.ac.za)
### Dataset Summary
The Afrikaans Ner Corpus is an Afrikaans dataset developed by [The Centre for Text Technology (CTexT), North-West University, South Africa](http://humanities.nwu.ac.za/ctext). The data is based on documents from the South African government domain crawled from gov.za websites. It was created to support the NER task for the Afrikaans language. The dataset uses CoNLL shared task annotation standards.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The language supported is Afrikaans.
## Dataset Structure
### Data Instances
A data point consists of sentences separated by empty lines, with tab-separated tokens and tags.
{'id': '0',
'ner_tags': [0, 0, 0, 0, 0],
'tokens': ['Vertaling', 'van', 'die', 'inligting', 'in']
}
### Data Fields
- `id`: id of the sample
- `tokens`: the tokens of the example text
- `ner_tags`: the NER tags of each token
The NER tags correspond to this list:
```
"OUT", "B-PERS", "I-PERS", "B-ORG", "I-ORG", "B-LOC", "I-LOC", "B-MISC", "I-MISC",
```
The NER tags have the same format as in the CoNLL shared task: a B denotes the first token of a phrase and an I any non-initial token. There are four types of phrases: person names (PERS), organizations (ORG), locations (LOC) and miscellaneous names (MISC). OUT is used for tokens not considered part of any named entity.
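As a minimal sketch, the integer tag ids can be decoded back to these labels without downloading the dataset, using the label list above and the instance shown in the Data Instances section:

```python
# Label list as given in this card; index i decodes tag id i.
NER_LABELS = ["OUT", "B-PERS", "I-PERS", "B-ORG", "I-ORG",
              "B-LOC", "I-LOC", "B-MISC", "I-MISC"]

# The example instance from the Data Instances section above.
example = {
    "id": "0",
    "ner_tags": [0, 0, 0, 0, 0],
    "tokens": ["Vertaling", "van", "die", "inligting", "in"],
}

# Decode each integer tag and pair it with its token.
decoded = [NER_LABELS[tag] for tag in example["ner_tags"]]
pairs = list(zip(example["tokens"], decoded))
print(pairs)  # every token in this sentence is outside any entity ("OUT")
```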
### Data Splits
The data was not split.
## Dataset Creation
### Curation Rationale
The data was created to help introduce resources for a new language, Afrikaans.
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
The data is based on the South African government domain and was crawled from gov.za websites.
[More Information Needed]
#### Who are the source language producers?
The data was produced by writers of South African government websites (gov.za).
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
The data was annotated during the NCHLT text resource development project.
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
The annotated data sets were developed by the Centre for Text Technology (CTexT, North-West University, South Africa).
See: [more information](http://www.nwu.ac.za/ctext)
### Licensing Information
The data is licensed under the [Creative Commons Attribution 2.5 South Africa License](http://creativecommons.org/licenses/by/2.5/za/legalcode)
### Citation Information
```
@inproceedings{afrikaans_ner_corpus,
author = { Gerhard van Huyssteen and
Martin Puttkammer and
E.B. Trollip and
J.C. Liversage and
Roald Eiselen},
title = {NCHLT Afrikaans Named Entity Annotated Corpus},
booktitle = {Eiselen, R. 2016. Government domain named entity recognition for South African languages. Proceedings of the 10th Language Resource and Evaluation Conference, Portorož, Slovenia.},
year = {2016},
url = {https://repo.sadilar.org/handle/20.500.12185/299},
}
```
### Contributions
Thanks to [@yvonnegitau](https://github.com/yvonnegitau) for adding this dataset. |
orgcatorg/megawika | ---
dataset_info:
- config_name: my
features:
- name: article_title
dtype: string
- name: article_text
dtype: string
- name: entries
list:
- name: id
dtype: string
- name: original
dtype: string
- name: original_sents
sequence: string
- name: parse_tokens
sequence:
sequence: string
- name: passage
struct:
- name: en_lang_token_map
struct:
- name: '0'
dtype: int64
- name: '1'
dtype: int64
- name: '10'
dtype: int64
- name: '100'
dtype: int64
- name: '101'
dtype: int64
- name: '102'
dtype: int64
- name: '103'
dtype: int64
- name: '104'
dtype: int64
- name: '105'
dtype: int64
- name: '106'
dtype: int64
- name: '107'
dtype: int64
- name: '108'
dtype: int64
- name: '109'
dtype: int64
- name: '11'
dtype: int64
- name: '110'
dtype: int64
- name: '111'
dtype: int64
- name: '112'
dtype: int64
- name: '113'
dtype: int64
- name: '114'
dtype: int64
- name: '115'
dtype: int64
- name: '116'
dtype: int64
- name: '117'
dtype: int64
- name: '118'
dtype: int64
- name: '119'
dtype: int64
- name: '12'
dtype: int64
- name: '120'
dtype: int64
- name: '121'
dtype: int64
- name: '122'
dtype: int64
- name: '123'
dtype: int64
- name: '124'
dtype: int64
- name: '125'
dtype: int64
- name: '126'
dtype: int64
- name: '127'
dtype: int64
- name: '128'
dtype: int64
- name: '129'
dtype: int64
- name: '13'
dtype: int64
- name: '130'
dtype: int64
- name: '131'
dtype: int64
- name: '132'
dtype: int64
- name: '133'
dtype: int64
- name: '134'
dtype: int64
- name: '135'
dtype: int64
- name: '136'
dtype: int64
- name: '137'
dtype: int64
- name: '138'
dtype: int64
- name: '139'
dtype: int64
- name: '14'
dtype: int64
- name: '140'
dtype: int64
- name: '141'
dtype: int64
- name: '142'
dtype: int64
- name: '143'
dtype: int64
- name: '144'
dtype: int64
- name: '145'
dtype: int64
- name: '146'
dtype: int64
- name: '147'
dtype: int64
- name: '148'
dtype: int64
- name: '149'
dtype: int64
- name: '15'
dtype: int64
- name: '150'
dtype: int64
- name: '151'
dtype: int64
- name: '152'
dtype: int64
- name: '153'
dtype: int64
- name: '154'
dtype: int64
- name: '155'
dtype: int64
- name: '156'
dtype: int64
- name: '157'
dtype: int64
- name: '158'
dtype: int64
- name: '159'
dtype: int64
- name: '16'
dtype: int64
- name: '160'
dtype: int64
- name: '161'
dtype: int64
- name: '162'
dtype: int64
- name: '163'
dtype: int64
- name: '164'
dtype: int64
- name: '165'
dtype: int64
- name: '166'
dtype: int64
- name: '167'
dtype: int64
- name: '168'
dtype: int64
- name: '169'
dtype: int64
- name: '17'
dtype: int64
- name: '170'
dtype: int64
- name: '171'
dtype: int64
- name: '172'
dtype: int64
- name: '173'
dtype: int64
- name: '174'
dtype: int64
- name: '175'
dtype: int64
- name: '176'
dtype: int64
- name: '177'
dtype: int64
- name: '178'
dtype: int64
- name: '179'
dtype: int64
- name: '18'
dtype: int64
- name: '180'
dtype: int64
- name: '181'
dtype: int64
- name: '182'
dtype: int64
- name: '183'
dtype: int64
- name: '184'
dtype: int64
- name: '185'
dtype: int64
- name: '186'
dtype: int64
- name: '187'
dtype: int64
- name: '188'
dtype: int64
- name: '189'
dtype: int64
- name: '19'
dtype: int64
- name: '190'
dtype: int64
- name: '191'
dtype: int64
- name: '192'
dtype: int64
- name: '193'
dtype: int64
- name: '194'
dtype: int64
- name: '195'
dtype: int64
- name: '196'
dtype: int64
- name: '197'
dtype: int64
- name: '198'
dtype: int64
- name: '199'
dtype: int64
- name: '2'
dtype: int64
- name: '20'
dtype: int64
- name: '200'
dtype: int64
- name: '201'
dtype: int64
- name: '202'
dtype: int64
- name: '203'
dtype: int64
- name: '204'
dtype: int64
- name: '205'
dtype: int64
- name: '206'
dtype: int64
- name: '207'
dtype: int64
- name: '208'
dtype: int64
- name: '209'
dtype: int64
- name: '21'
dtype: int64
- name: '210'
dtype: int64
- name: '211'
dtype: int64
- name: '212'
dtype: int64
- name: '213'
dtype: int64
- name: '214'
dtype: int64
- name: '215'
dtype: int64
- name: '216'
dtype: int64
- name: '217'
dtype: int64
- name: '218'
dtype: int64
- name: '219'
dtype: int64
- name: '22'
dtype: int64
- name: '220'
dtype: int64
- name: '221'
dtype: int64
- name: '222'
dtype: int64
- name: '223'
dtype: int64
- name: '224'
dtype: int64
- name: '225'
dtype: int64
- name: '226'
dtype: int64
- name: '227'
dtype: int64
- name: '228'
dtype: int64
- name: '229'
dtype: int64
- name: '23'
dtype: int64
- name: '230'
dtype: int64
- name: '231'
dtype: int64
- name: '232'
dtype: int64
- name: '233'
dtype: int64
- name: '234'
dtype: int64
- name: '235'
dtype: int64
- name: '236'
dtype: int64
- name: '237'
dtype: int64
- name: '238'
dtype: int64
- name: '239'
dtype: int64
- name: '24'
dtype: int64
- name: '240'
dtype: int64
- name: '241'
dtype: int64
- name: '242'
dtype: int64
- name: '243'
dtype: int64
- name: '244'
dtype: int64
- name: '245'
dtype: int64
- name: '246'
dtype: int64
- name: '247'
dtype: int64
- name: '248'
dtype: int64
- name: '249'
dtype: int64
- name: '25'
dtype: int64
- name: '250'
dtype: int64
- name: '251'
dtype: int64
- name: '252'
dtype: int64
- name: '253'
dtype: int64
- name: '254'
dtype: int64
- name: '255'
dtype: int64
- name: '256'
dtype: int64
- name: '257'
dtype: int64
- name: '258'
dtype: int64
- name: '259'
dtype: int64
- name: '26'
dtype: int64
- name: '260'
dtype: int64
- name: '261'
dtype: int64
- name: '262'
dtype: int64
- name: '263'
dtype: int64
- name: '264'
dtype: int64
- name: '265'
dtype: int64
- name: '266'
dtype: int64
- name: '267'
dtype: int64
- name: '268'
dtype: int64
- name: '269'
dtype: int64
- name: '27'
dtype: int64
- name: '270'
dtype: int64
- name: '271'
dtype: int64
- name: '272'
dtype: int64
- name: '273'
dtype: int64
- name: '274'
dtype: int64
- name: '275'
dtype: int64
- name: '276'
dtype: int64
- name: '277'
dtype: int64
- name: '278'
dtype: int64
- name: '279'
dtype: int64
- name: '28'
dtype: int64
- name: '280'
dtype: int64
- name: '281'
dtype: int64
- name: '282'
dtype: int64
- name: '283'
dtype: int64
- name: '284'
dtype: int64
- name: '285'
dtype: int64
- name: '286'
dtype: int64
- name: '287'
dtype: int64
- name: '288'
dtype: int64
- name: '289'
dtype: int64
- name: '29'
dtype: int64
- name: '290'
dtype: int64
- name: '291'
dtype: int64
- name: '292'
dtype: int64
- name: '293'
dtype: int64
- name: '294'
dtype: int64
- name: '295'
dtype: int64
- name: '296'
dtype: int64
- name: '297'
dtype: int64
- name: '298'
dtype: int64
- name: '299'
dtype: int64
- name: '3'
dtype: int64
- name: '30'
dtype: int64
- name: '300'
dtype: int64
- name: '301'
dtype: int64
- name: '302'
dtype: int64
- name: '303'
dtype: int64
- name: '304'
dtype: int64
- name: '305'
dtype: int64
- name: '306'
dtype: int64
- name: '307'
dtype: int64
- name: '308'
dtype: int64
- name: '309'
dtype: int64
- name: '31'
dtype: int64
- name: '310'
dtype: int64
- name: '311'
dtype: int64
- name: '312'
dtype: int64
- name: '313'
dtype: int64
- name: '314'
dtype: int64
- name: '315'
dtype: int64
- name: '316'
dtype: int64
- name: '317'
dtype: int64
- name: '318'
dtype: int64
- name: '319'
dtype: int64
- name: '32'
dtype: int64
- name: '320'
dtype: int64
- name: '321'
dtype: int64
- name: '322'
dtype: int64
- name: '323'
dtype: int64
- name: '324'
dtype: int64
- name: '325'
dtype: int64
- name: '326'
dtype: int64
- name: '327'
dtype: int64
- name: '328'
dtype: int64
- name: '329'
dtype: int64
- name: '33'
dtype: int64
- name: '330'
dtype: int64
- name: '331'
dtype: int64
- name: '332'
dtype: int64
- name: '333'
dtype: int64
- name: '334'
dtype: int64
- name: '335'
dtype: int64
- name: '336'
dtype: int64
- name: '337'
dtype: int64
- name: '338'
dtype: int64
- name: '339'
dtype: int64
- name: '34'
dtype: int64
- name: '340'
dtype: int64
- name: '341'
dtype: int64
- name: '342'
dtype: int64
- name: '343'
dtype: int64
- name: '344'
dtype: int64
- name: '345'
dtype: int64
- name: '346'
dtype: int64
- name: '347'
dtype: int64
- name: '348'
dtype: int64
- name: '349'
dtype: int64
- name: '35'
dtype: int64
- name: '350'
dtype: int64
- name: '351'
dtype: int64
- name: '352'
dtype: int64
- name: '353'
dtype: int64
- name: '354'
dtype: int64
- name: '355'
dtype: int64
- name: '356'
dtype: int64
- name: '357'
dtype: int64
- name: '358'
dtype: int64
- name: '359'
dtype: int64
- name: '36'
dtype: int64
- name: '360'
dtype: int64
- name: '361'
dtype: int64
- name: '362'
dtype: int64
- name: '363'
dtype: int64
- name: '364'
dtype: int64
- name: '365'
dtype: int64
- name: '366'
dtype: int64
- name: '367'
dtype: int64
- name: '368'
dtype: int64
- name: '369'
dtype: int64
- name: '37'
dtype: int64
- name: '370'
dtype: int64
- name: '371'
dtype: int64
- name: '372'
dtype: int64
- name: '373'
dtype: int64
- name: '374'
dtype: int64
- name: '375'
dtype: int64
- name: '376'
dtype: int64
- name: '377'
dtype: int64
- name: '378'
dtype: int64
- name: '379'
dtype: int64
- name: '38'
dtype: int64
- name: '380'
dtype: int64
- name: '381'
dtype: int64
- name: '382'
dtype: int64
- name: '383'
dtype: int64
- name: '384'
dtype: int64
- name: '385'
dtype: int64
- name: '386'
dtype: int64
- name: '387'
dtype: int64
- name: '388'
dtype: int64
- name: '389'
dtype: int64
- name: '39'
dtype: int64
- name: '390'
dtype: int64
- name: '391'
dtype: int64
- name: '392'
dtype: int64
- name: '393'
dtype: int64
- name: '394'
dtype: int64
- name: '395'
dtype: int64
- name: '396'
dtype: int64
- name: '397'
dtype: int64
- name: '4'
dtype: int64
- name: '40'
dtype: int64
- name: '41'
dtype: int64
- name: '42'
dtype: int64
- name: '43'
dtype: int64
- name: '44'
dtype: int64
- name: '45'
dtype: int64
- name: '46'
dtype: int64
- name: '47'
dtype: int64
- name: '48'
dtype: int64
- name: '49'
dtype: int64
- name: '5'
dtype: int64
- name: '50'
dtype: int64
- name: '51'
dtype: int64
- name: '52'
dtype: int64
- name: '53'
dtype: int64
- name: '54'
dtype: int64
- name: '55'
dtype: int64
- name: '56'
dtype: int64
- name: '57'
dtype: int64
- name: '58'
dtype: int64
- name: '59'
dtype: int64
- name: '6'
dtype: int64
- name: '60'
dtype: int64
- name: '61'
dtype: int64
- name: '62'
dtype: int64
- name: '63'
dtype: int64
- name: '64'
dtype: int64
- name: '65'
dtype: int64
- name: '66'
dtype: int64
- name: '67'
dtype: int64
- name: '68'
dtype: int64
- name: '69'
dtype: int64
- name: '7'
dtype: int64
- name: '70'
dtype: int64
- name: '71'
dtype: int64
- name: '72'
dtype: int64
- name: '73'
dtype: int64
- name: '74'
dtype: int64
- name: '75'
dtype: int64
- name: '76'
dtype: int64
- name: '77'
dtype: int64
- name: '78'
dtype: int64
- name: '79'
dtype: int64
- name: '8'
dtype: int64
- name: '80'
dtype: int64
- name: '81'
dtype: int64
- name: '82'
dtype: int64
- name: '83'
dtype: int64
- name: '84'
dtype: int64
- name: '85'
dtype: int64
- name: '86'
dtype: int64
- name: '87'
dtype: int64
- name: '88'
dtype: int64
- name: '89'
dtype: int64
- name: '9'
dtype: int64
- name: '90'
dtype: int64
- name: '91'
dtype: int64
- name: '92'
dtype: int64
- name: '93'
dtype: int64
- name: '94'
dtype: int64
- name: '95'
dtype: int64
- name: '96'
dtype: int64
- name: '97'
dtype: int64
- name: '98'
dtype: int64
- name: '99'
dtype: int64
- name: en_tokens
struct:
- name: '0'
dtype: string
- name: '1'
dtype: string
- name: '10'
dtype: string
- name: '100'
dtype: string
- name: '101'
dtype: string
- name: '102'
dtype: string
- name: '103'
dtype: string
- name: '104'
dtype: string
- name: '105'
dtype: string
- name: '106'
dtype: string
- name: '107'
dtype: string
- name: '108'
dtype: string
- name: '109'
dtype: string
- name: '11'
dtype: string
- name: '110'
dtype: string
- name: '111'
dtype: string
- name: '112'
dtype: string
- name: '113'
dtype: string
- name: '114'
dtype: string
- name: '115'
dtype: string
- name: '116'
dtype: string
- name: '117'
dtype: string
- name: '118'
dtype: string
- name: '119'
dtype: string
- name: '12'
dtype: string
- name: '120'
dtype: string
- name: '121'
dtype: string
- name: '122'
dtype: string
- name: '123'
dtype: string
- name: '124'
dtype: string
- name: '125'
dtype: string
- name: '126'
dtype: string
- name: '127'
dtype: string
- name: '128'
dtype: string
- name: '129'
dtype: string
- name: '13'
dtype: string
- name: '130'
dtype: string
- name: '131'
dtype: string
- name: '132'
dtype: string
- name: '133'
dtype: string
- name: '134'
dtype: string
- name: '135'
dtype: string
- name: '136'
dtype: string
- name: '137'
dtype: string
- name: '138'
dtype: string
- name: '139'
dtype: string
- name: '14'
dtype: string
- name: '140'
dtype: string
- name: '141'
dtype: string
- name: '142'
dtype: string
- name: '143'
dtype: string
- name: '144'
dtype: string
- name: '145'
dtype: string
- name: '146'
dtype: string
- name: '147'
dtype: string
- name: '148'
dtype: string
- name: '149'
dtype: string
- name: '15'
dtype: string
- name: '150'
dtype: string
- name: '151'
dtype: string
- name: '152'
dtype: string
- name: '153'
dtype: string
- name: '154'
dtype: string
- name: '155'
dtype: string
- name: '156'
dtype: string
- name: '157'
dtype: string
- name: '158'
dtype: string
- name: '159'
dtype: string
- name: '16'
dtype: string
- name: '160'
dtype: string
- name: '161'
dtype: string
- name: '162'
dtype: string
- name: '163'
dtype: string
- name: '164'
dtype: string
- name: '165'
dtype: string
- name: '166'
dtype: string
- name: '167'
dtype: string
- name: '168'
dtype: string
- name: '169'
dtype: string
- name: '17'
dtype: string
- name: '170'
dtype: string
- name: '171'
dtype: string
- name: '172'
dtype: string
- name: '173'
dtype: string
- name: '174'
dtype: string
- name: '175'
dtype: string
- name: '176'
dtype: string
- name: '177'
dtype: string
- name: '178'
dtype: string
- name: '179'
dtype: string
- name: '18'
dtype: string
- name: '180'
dtype: string
- name: '181'
dtype: string
- name: '182'
dtype: string
- name: '183'
dtype: string
- name: '184'
dtype: string
- name: '185'
dtype: string
- name: '186'
dtype: string
- name: '187'
dtype: string
- name: '188'
dtype: string
- name: '189'
dtype: string
- name: '19'
dtype: string
- name: '190'
dtype: string
- name: '191'
dtype: string
- name: '192'
dtype: string
- name: '193'
dtype: string
- name: '194'
dtype: string
- name: '195'
dtype: string
- name: '196'
dtype: string
- name: '197'
dtype: string
- name: '198'
dtype: string
- name: '199'
dtype: string
- name: '2'
dtype: string
- name: '20'
dtype: string
- name: '200'
dtype: string
- name: '201'
dtype: string
- name: '202'
dtype: string
- name: '203'
dtype: string
- name: '204'
dtype: string
- name: '205'
dtype: string
- name: '206'
dtype: string
- name: '207'
dtype: string
- name: '208'
dtype: string
- name: '209'
dtype: string
- name: '21'
dtype: string
- name: '210'
dtype: string
- name: '211'
dtype: string
- name: '212'
dtype: string
- name: '213'
dtype: string
- name: '214'
dtype: string
- name: '215'
dtype: string
- name: '216'
dtype: string
- name: '217'
dtype: string
- name: '218'
dtype: string
- name: '219'
dtype: string
- name: '22'
dtype: string
- name: '220'
dtype: string
- name: '221'
dtype: string
- name: '222'
dtype: string
- name: '223'
dtype: string
- name: '224'
dtype: string
- name: '225'
dtype: string
- name: '226'
dtype: string
- name: '227'
dtype: string
- name: '228'
dtype: string
- name: '229'
dtype: string
- name: '23'
dtype: string
- name: '230'
dtype: string
- name: '231'
dtype: string
- name: '232'
dtype: string
- name: '233'
dtype: string
- name: '234'
dtype: string
- name: '235'
dtype: string
- name: '236'
dtype: string
- name: '237'
dtype: string
- name: '238'
dtype: string
- name: '239'
dtype: string
- name: '24'
dtype: string
- name: '240'
dtype: string
- name: '241'
dtype: string
- name: '242'
dtype: string
- name: '243'
dtype: string
- name: '244'
dtype: string
- name: '245'
dtype: string
- name: '246'
dtype: string
- name: '247'
dtype: string
- name: '248'
dtype: string
- name: '249'
dtype: string
- name: '25'
dtype: string
- name: '250'
dtype: string
- name: '251'
dtype: string
- name: '252'
dtype: string
- name: '253'
dtype: string
- name: '254'
dtype: string
- name: '255'
dtype: string
- name: '256'
dtype: string
- name: '257'
dtype: string
- name: '258'
dtype: string
- name: '259'
dtype: string
- name: '26'
dtype: string
- name: '260'
dtype: string
- name: '261'
dtype: string
- name: '262'
dtype: string
- name: '263'
dtype: string
- name: '264'
dtype: string
- name: '265'
dtype: string
- name: '266'
dtype: string
- name: '267'
dtype: string
- name: '268'
dtype: string
- name: '269'
dtype: string
- name: '27'
dtype: string
- name: '270'
dtype: string
- name: '271'
dtype: string
- name: '272'
dtype: string
- name: '273'
dtype: string
- name: '274'
dtype: string
- name: '275'
dtype: string
- name: '276'
dtype: string
- name: '277'
dtype: string
- name: '278'
dtype: string
- name: '279'
dtype: string
- name: '28'
dtype: string
- name: '280'
dtype: string
- name: '281'
dtype: string
- name: '282'
dtype: string
- name: '283'
dtype: string
- name: '284'
dtype: string
- name: '285'
dtype: string
- name: '286'
dtype: string
- name: '287'
dtype: string
- name: '288'
dtype: string
- name: '289'
dtype: string
- name: '29'
dtype: string
- name: '290'
dtype: string
- name: '291'
dtype: string
- name: '292'
dtype: string
- name: '293'
dtype: string
- name: '294'
dtype: string
- name: '295'
dtype: string
- name: '296'
dtype: string
- name: '297'
dtype: string
- name: '298'
dtype: string
- name: '299'
dtype: string
- name: '3'
dtype: string
- name: '30'
dtype: string
- name: '300'
dtype: string
- name: '301'
dtype: string
- name: '302'
dtype: string
- name: '303'
dtype: string
- name: '304'
dtype: string
- name: '305'
dtype: string
- name: '306'
dtype: string
- name: '307'
dtype: string
- name: '308'
dtype: string
- name: '309'
dtype: string
- name: '31'
dtype: string
- name: '310'
dtype: string
- name: '311'
dtype: string
- name: '312'
dtype: string
- name: '313'
dtype: string
- name: '314'
dtype: string
- name: '315'
dtype: string
- name: '316'
dtype: string
- name: '317'
dtype: string
- name: '318'
dtype: string
- name: '319'
dtype: string
- name: '32'
dtype: string
- name: '320'
dtype: string
- name: '321'
dtype: string
- name: '322'
dtype: string
- name: '323'
dtype: string
- name: '324'
dtype: string
- name: '325'
dtype: string
- name: '326'
dtype: string
- name: '327'
dtype: string
- name: '328'
dtype: string
- name: '329'
dtype: string
- name: '33'
dtype: string
- name: '330'
dtype: string
- name: '331'
dtype: string
- name: '332'
dtype: string
- name: '333'
dtype: string
- name: '334'
dtype: string
- name: '335'
dtype: string
- name: '336'
dtype: string
- name: '337'
dtype: string
- name: '338'
dtype: string
- name: '339'
dtype: string
- name: '34'
dtype: string
- name: '340'
dtype: string
- name: '341'
dtype: string
- name: '342'
dtype: string
- name: '343'
dtype: string
- name: '344'
dtype: string
- name: '345'
dtype: string
- name: '346'
dtype: string
- name: '347'
dtype: string
- name: '348'
dtype: string
- name: '349'
dtype: string
- name: '35'
dtype: string
- name: '350'
dtype: string
- name: '351'
dtype: string
- name: '352'
dtype: string
- name: '353'
dtype: string
- name: '354'
dtype: string
- name: '355'
dtype: string
- name: '356'
dtype: string
- name: '357'
dtype: string
- name: '358'
dtype: string
- name: '359'
dtype: string
- name: '36'
dtype: string
- name: '360'
dtype: string
- name: '361'
dtype: string
- name: '362'
dtype: string
- name: '363'
dtype: string
- name: '364'
dtype: string
- name: '365'
dtype: string
- name: '366'
dtype: string
- name: '367'
dtype: string
- name: '368'
dtype: string
- name: '369'
dtype: string
- name: '37'
dtype: string
- name: '370'
dtype: string
- name: '371'
dtype: string
- name: '372'
dtype: string
- name: '373'
dtype: string
- name: '374'
dtype: string
- name: '375'
dtype: string
- name: '376'
dtype: string
- name: '377'
dtype: string
- name: '378'
dtype: string
- name: '379'
dtype: string
- name: '38'
dtype: string
- name: '380'
dtype: string
- name: '381'
dtype: string
- name: '382'
dtype: string
- name: '383'
dtype: string
- name: '384'
dtype: string
- name: '385'
dtype: string
- name: '386'
dtype: string
- name: '387'
dtype: string
- name: '388'
dtype: string
- name: '389'
dtype: string
- name: '39'
dtype: string
- name: '390'
dtype: string
- name: '391'
dtype: string
- name: '392'
dtype: string
- name: '393'
dtype: string
- name: '394'
dtype: string
- name: '395'
dtype: string
- name: '396'
dtype: string
- name: '397'
dtype: string
- name: '4'
dtype: string
- name: '40'
dtype: string
- name: '41'
dtype: string
- name: '42'
dtype: string
- name: '43'
dtype: string
- name: '44'
dtype: string
- name: '45'
dtype: string
- name: '46'
dtype: string
- name: '47'
dtype: string
- name: '48'
dtype: string
- name: '49'
dtype: string
- name: '5'
dtype: string
- name: '50'
dtype: string
- name: '51'
dtype: string
- name: '52'
dtype: string
- name: '53'
dtype: string
- name: '54'
dtype: string
- name: '55'
dtype: string
- name: '56'
dtype: string
- name: '57'
dtype: string
- name: '58'
dtype: string
- name: '59'
dtype: string
- name: '6'
dtype: string
- name: '60'
dtype: string
- name: '61'
dtype: string
- name: '62'
dtype: string
- name: '63'
dtype: string
- name: '64'
dtype: string
- name: '65'
dtype: string
- name: '66'
dtype: string
- name: '67'
dtype: string
- name: '68'
dtype: string
- name: '69'
dtype: string
- name: '7'
dtype: string
- name: '70'
dtype: string
- name: '71'
dtype: string
- name: '72'
dtype: string
- name: '73'
dtype: string
- name: '74'
dtype: string
- name: '75'
dtype: string
- name: '76'
dtype: string
- name: '77'
dtype: string
- name: '78'
dtype: string
- name: '79'
dtype: string
- name: '8'
dtype: string
- name: '80'
dtype: string
- name: '81'
dtype: string
- name: '82'
dtype: string
- name: '83'
dtype: string
- name: '84'
dtype: string
- name: '85'
dtype: string
- name: '86'
dtype: string
- name: '87'
dtype: string
- name: '88'
dtype: string
- name: '89'
dtype: string
- name: '9'
dtype: string
- name: '90'
dtype: string
- name: '91'
dtype: string
- name: '92'
dtype: string
- name: '93'
dtype: string
- name: '94'
dtype: string
- name: '95'
dtype: string
- name: '96'
dtype: string
- name: '97'
dtype: string
- name: '98'
dtype: string
- name: '99'
dtype: string
- name: lang_tokens
struct:
- name: '0'
dtype: string
- name: '1'
dtype: string
- name: '10'
dtype: string
- name: '100'
dtype: string
- name: '101'
dtype: string
- name: '102'
dtype: string
- name: '103'
dtype: string
- name: '104'
dtype: string
- name: '105'
dtype: string
- name: '106'
dtype: string
- name: '107'
dtype: string
- name: '108'
dtype: string
- name: '109'
dtype: string
- name: '11'
dtype: string
- name: '110'
dtype: string
- name: '111'
dtype: string
- name: '112'
dtype: string
- name: '113'
dtype: string
- name: '114'
dtype: string
- name: '115'
dtype: string
- name: '116'
dtype: string
- name: '117'
dtype: string
- name: '118'
dtype: string
- name: '119'
dtype: string
- name: '12'
dtype: string
- name: '120'
dtype: string
- name: '121'
dtype: string
- name: '122'
dtype: string
- name: '123'
dtype: string
- name: '124'
dtype: string
- name: '125'
dtype: string
- name: '126'
dtype: string
- name: '127'
dtype: string
- name: '128'
dtype: string
- name: '129'
dtype: string
- name: '13'
dtype: string
- name: '130'
dtype: string
- name: '131'
dtype: string
- name: '132'
dtype: string
- name: '133'
dtype: string
- name: '134'
dtype: string
- name: '135'
dtype: string
- name: '136'
dtype: string
- name: '137'
dtype: string
- name: '138'
dtype: string
- name: '139'
dtype: string
- name: '14'
dtype: string
- name: '140'
dtype: string
- name: '141'
dtype: string
- name: '142'
dtype: string
- name: '143'
dtype: string
- name: '144'
dtype: string
- name: '145'
dtype: string
- name: '146'
dtype: string
- name: '147'
dtype: string
- name: '148'
dtype: string
- name: '149'
dtype: string
- name: '15'
dtype: string
- name: '150'
dtype: string
- name: '151'
dtype: string
- name: '152'
dtype: string
- name: '153'
dtype: string
- name: '154'
dtype: string
- name: '155'
dtype: string
- name: '156'
dtype: string
- name: '157'
dtype: string
- name: '158'
dtype: string
- name: '159'
dtype: string
- name: '16'
dtype: string
- name: '160'
dtype: string
- name: '161'
dtype: string
- name: '162'
dtype: string
- name: '163'
dtype: string
- name: '164'
dtype: string
- name: '165'
dtype: string
- name: '166'
dtype: string
- name: '167'
dtype: string
- name: '168'
dtype: string
- name: '169'
dtype: string
- name: '17'
dtype: string
- name: '170'
dtype: string
- name: '171'
dtype: string
- name: '172'
dtype: string
- name: '173'
dtype: string
- name: '174'
dtype: string
- name: '175'
dtype: string
- name: '176'
dtype: string
- name: '177'
dtype: string
- name: '178'
dtype: string
- name: '179'
dtype: string
- name: '18'
dtype: string
- name: '180'
dtype: string
- name: '181'
dtype: string
- name: '182'
dtype: string
- name: '183'
dtype: string
- name: '184'
dtype: string
- name: '185'
dtype: string
- name: '186'
dtype: string
- name: '187'
dtype: string
- name: '188'
dtype: string
- name: '189'
dtype: string
- name: '19'
dtype: string
- name: '190'
dtype: string
- name: '191'
dtype: string
- name: '192'
dtype: string
- name: '193'
dtype: string
- name: '194'
dtype: string
- name: '195'
dtype: string
- name: '196'
dtype: string
- name: '197'
dtype: string
- name: '198'
dtype: string
- name: '199'
dtype: string
- name: '2'
dtype: string
- name: '20'
dtype: string
- name: '200'
dtype: string
- name: '201'
dtype: string
- name: '202'
dtype: string
- name: '203'
dtype: string
- name: '204'
dtype: string
- name: '205'
dtype: string
- name: '206'
dtype: string
- name: '207'
dtype: string
- name: '208'
dtype: string
- name: '209'
dtype: string
- name: '21'
dtype: string
- name: '210'
dtype: string
- name: '211'
dtype: string
- name: '212'
dtype: string
- name: '213'
dtype: string
- name: '214'
dtype: string
- name: '215'
dtype: string
- name: '216'
dtype: string
- name: '217'
dtype: string
- name: '218'
dtype: string
- name: '219'
dtype: string
- name: '22'
dtype: string
- name: '220'
dtype: string
- name: '221'
dtype: string
- name: '222'
dtype: string
- name: '223'
dtype: string
- name: '224'
dtype: string
- name: '225'
dtype: string
- name: '226'
dtype: string
- name: '227'
dtype: string
- name: '228'
dtype: string
- name: '229'
dtype: string
- name: '23'
dtype: string
- name: '230'
dtype: string
- name: '231'
dtype: string
- name: '232'
dtype: string
- name: '233'
dtype: string
- name: '234'
dtype: string
- name: '235'
dtype: string
- name: '236'
dtype: string
- name: '237'
dtype: string
- name: '238'
dtype: string
- name: '239'
dtype: string
- name: '24'
dtype: string
- name: '240'
dtype: string
- name: '241'
dtype: string
- name: '242'
dtype: string
- name: '243'
dtype: string
- name: '244'
dtype: string
- name: '245'
dtype: string
- name: '246'
dtype: string
- name: '247'
dtype: string
- name: '248'
dtype: string
- name: '249'
dtype: string
- name: '25'
dtype: string
- name: '250'
dtype: string
- name: '251'
dtype: string
- name: '252'
dtype: string
- name: '253'
dtype: string
- name: '254'
dtype: string
- name: '255'
dtype: string
- name: '256'
dtype: string
- name: '257'
dtype: string
- name: '258'
dtype: string
- name: '259'
dtype: string
- name: '26'
dtype: string
- name: '260'
dtype: string
- name: '261'
dtype: string
- name: '262'
dtype: string
- name: '263'
dtype: string
- name: '264'
dtype: string
- name: '265'
dtype: string
- name: '266'
dtype: string
- name: '267'
dtype: string
- name: '268'
dtype: string
- name: '269'
dtype: string
- name: '27'
dtype: string
- name: '270'
dtype: string
- name: '271'
dtype: string
- name: '272'
dtype: string
- name: '273'
dtype: string
- name: '274'
dtype: string
- name: '275'
dtype: string
- name: '276'
dtype: string
- name: '277'
dtype: string
- name: '278'
dtype: string
- name: '279'
dtype: string
- name: '28'
dtype: string
- name: '280'
dtype: string
- name: '281'
dtype: string
- name: '282'
dtype: string
- name: '283'
dtype: string
- name: '284'
dtype: string
- name: '285'
dtype: string
- name: '286'
dtype: string
- name: '287'
dtype: string
- name: '288'
dtype: string
- name: '289'
dtype: string
- name: '29'
dtype: string
- name: '290'
dtype: string
- name: '291'
dtype: string
- name: '292'
dtype: string
- name: '293'
dtype: string
- name: '294'
dtype: string
- name: '295'
dtype: string
- name: '296'
dtype: string
- name: '297'
dtype: string
- name: '298'
dtype: string
- name: '299'
dtype: string
- name: '3'
dtype: string
- name: '30'
dtype: string
- name: '300'
dtype: string
- name: '301'
dtype: string
- name: '302'
dtype: string
- name: '303'
dtype: string
- name: '304'
dtype: string
- name: '305'
dtype: string
- name: '306'
dtype: string
- name: '307'
dtype: string
- name: '308'
dtype: string
- name: '309'
dtype: string
- name: '31'
dtype: string
- name: '310'
dtype: string
- name: '311'
dtype: string
- name: '312'
dtype: string
- name: '313'
dtype: string
- name: '314'
dtype: string
- name: '315'
dtype: string
- name: '316'
dtype: string
- name: '317'
dtype: string
- name: '318'
dtype: string
- name: '319'
dtype: string
- name: '32'
dtype: string
- name: '320'
dtype: string
- name: '321'
dtype: string
- name: '322'
dtype: string
- name: '323'
dtype: string
- name: '324'
dtype: string
- name: '325'
dtype: string
- name: '326'
dtype: string
- name: '327'
dtype: string
- name: '328'
dtype: string
- name: '329'
dtype: string
- name: '33'
dtype: string
- name: '330'
dtype: string
- name: '331'
dtype: string
- name: '332'
dtype: string
- name: '333'
dtype: string
- name: '334'
dtype: string
- name: '335'
dtype: string
- name: '336'
dtype: string
- name: '337'
dtype: string
- name: '338'
dtype: string
- name: '339'
dtype: string
- name: '34'
dtype: string
- name: '340'
dtype: string
- name: '341'
dtype: string
- name: '342'
dtype: string
- name: '343'
dtype: string
- name: '344'
dtype: string
- name: '345'
dtype: string
- name: '346'
dtype: string
- name: '347'
dtype: string
- name: '348'
dtype: string
- name: '349'
dtype: string
- name: '35'
dtype: string
- name: '350'
dtype: string
- name: '351'
dtype: string
- name: '352'
dtype: string
- name: '353'
dtype: string
- name: '354'
dtype: string
- name: '355'
dtype: string
- name: '356'
dtype: string
- name: '357'
dtype: string
- name: '358'
dtype: string
- name: '359'
dtype: string
- name: '36'
dtype: string
- name: '360'
dtype: string
- name: '361'
dtype: string
- name: '362'
dtype: string
- name: '363'
dtype: string
- name: '364'
dtype: string
- name: '365'
dtype: string
- name: '366'
dtype: string
- name: '367'
dtype: string
- name: '368'
dtype: string
- name: '369'
dtype: string
- name: '37'
dtype: string
- name: '370'
dtype: string
- name: '371'
dtype: string
- name: '372'
dtype: string
- name: '373'
dtype: string
- name: '374'
dtype: string
- name: '375'
dtype: string
- name: '376'
dtype: string
- name: '38'
dtype: string
- name: '39'
dtype: string
- name: '4'
dtype: string
- name: '40'
dtype: string
- name: '41'
dtype: string
- name: '42'
dtype: string
- name: '43'
dtype: string
- name: '44'
dtype: string
- name: '45'
dtype: string
- name: '46'
dtype: string
- name: '47'
dtype: string
- name: '48'
dtype: string
- name: '49'
dtype: string
- name: '5'
dtype: string
- name: '50'
dtype: string
- name: '51'
dtype: string
- name: '52'
dtype: string
- name: '53'
dtype: string
- name: '54'
dtype: string
- name: '55'
dtype: string
- name: '56'
dtype: string
- name: '57'
dtype: string
- name: '58'
dtype: string
- name: '59'
dtype: string
- name: '6'
dtype: string
- name: '60'
dtype: string
- name: '61'
dtype: string
- name: '62'
dtype: string
- name: '63'
dtype: string
- name: '64'
dtype: string
- name: '65'
dtype: string
- name: '66'
dtype: string
- name: '67'
dtype: string
- name: '68'
dtype: string
- name: '69'
dtype: string
- name: '7'
dtype: string
- name: '70'
dtype: string
- name: '71'
dtype: string
- name: '72'
dtype: string
- name: '73'
dtype: string
- name: '74'
dtype: string
- name: '75'
dtype: string
- name: '76'
dtype: string
- name: '77'
dtype: string
- name: '78'
dtype: string
- name: '79'
dtype: string
- name: '8'
dtype: string
- name: '80'
dtype: string
- name: '81'
dtype: string
- name: '82'
dtype: string
- name: '83'
dtype: string
- name: '84'
dtype: string
- name: '85'
dtype: string
- name: '86'
dtype: string
- name: '87'
dtype: string
- name: '88'
dtype: string
- name: '89'
dtype: string
- name: '9'
dtype: string
- name: '90'
dtype: string
- name: '91'
dtype: string
- name: '92'
dtype: string
- name: '93'
dtype: string
- name: '94'
dtype: string
- name: '95'
dtype: string
- name: '96'
dtype: string
- name: '97'
dtype: string
- name: '98'
dtype: string
- name: '99'
dtype: string
- name: parse
list:
- name: children
list:
- name: children
list:
- name: children
sequence: 'null'
- name: confidence
dtype: float64
- name: label
dtype: string
- name: span
sequence: int64
- name: confidence
dtype: float64
- name: label
dtype: string
- name: span
sequence: int64
- name: confidence
dtype: float64
- name: label
dtype: string
- name: span
sequence: int64
- name: text
sequence: string
- name: qa_pairs
list:
- name: en_answer
dtype: string
- name: en_answer_tokens
sequence: string
- name: en_match_in_passage
sequence: int64
- name: en_matches_in_source
sequence:
sequence: int64
- name: frames
list:
- name: argument
dtype: string
- name: frame
dtype: string
- name: lang_answer
dtype: string
- name: lang_match_in_passage
sequence: int64
- name: lang_matches_in_source
sequence:
sequence: int64
- name: match_disambiguated_question
dtype: string
- name: passage
sequence: string
- name: passage_id
dtype: string
- name: question
dtype: string
- name: repetitious_translation
dtype: bool
- name: source_lang
dtype: string
- name: source_text
dtype: string
- name: source_url
dtype: string
- name: translation
dtype: string
- name: translation_probs
sequence: string
- name: translation_sents
sequence: string
splits:
- name: my
num_bytes: 1457184817
num_examples: 58619
download_size: 0
dataset_size: 1457184817
- config_name: my_refined
features:
- name: article_title
dtype: string
- name: article_text
dtype: string
- name: entries
list:
- name: id
dtype: string
- name: original
dtype: string
- name: original_sents
sequence: string
- name: parse_tokens
sequence:
sequence: string
- name: passage
struct:
- name: en_lang_token_map
struct:
- name: '0'
dtype: int64
- name: '1'
dtype: int64
- name: '10'
dtype: int64
- name: '100'
dtype: int64
- name: '101'
dtype: int64
- name: '102'
dtype: int64
- name: '103'
dtype: int64
- name: '104'
dtype: int64
- name: '105'
dtype: int64
- name: '106'
dtype: int64
- name: '107'
dtype: int64
- name: '108'
dtype: int64
- name: '109'
dtype: int64
- name: '11'
dtype: int64
- name: '110'
dtype: int64
- name: '111'
dtype: int64
- name: '112'
dtype: int64
- name: '113'
dtype: int64
- name: '114'
dtype: int64
- name: '115'
dtype: int64
- name: '116'
dtype: int64
- name: '117'
dtype: int64
- name: '118'
dtype: int64
- name: '119'
dtype: int64
- name: '12'
dtype: int64
- name: '120'
dtype: int64
- name: '121'
dtype: int64
- name: '122'
dtype: int64
- name: '123'
dtype: int64
- name: '124'
dtype: int64
- name: '125'
dtype: int64
- name: '126'
dtype: int64
- name: '127'
dtype: int64
- name: '128'
dtype: int64
- name: '129'
dtype: int64
- name: '13'
dtype: int64
- name: '130'
dtype: int64
- name: '131'
dtype: int64
- name: '132'
dtype: int64
- name: '133'
dtype: int64
- name: '134'
dtype: int64
- name: '135'
dtype: int64
- name: '136'
dtype: int64
- name: '137'
dtype: int64
- name: '138'
dtype: int64
- name: '139'
dtype: int64
- name: '14'
dtype: int64
- name: '140'
dtype: 'null'
- name: '141'
dtype: int64
- name: '142'
dtype: int64
- name: '143'
dtype: int64
- name: '144'
dtype: int64
- name: '145'
dtype: int64
- name: '146'
dtype: int64
- name: '147'
dtype: int64
- name: '148'
dtype: int64
- name: '149'
dtype: int64
- name: '15'
dtype: int64
- name: '150'
dtype: int64
- name: '151'
dtype: int64
- name: '152'
dtype: int64
- name: '153'
dtype: int64
- name: '154'
dtype: int64
- name: '155'
dtype: 'null'
- name: '156'
dtype: int64
- name: '157'
dtype: int64
- name: '158'
dtype: int64
- name: '159'
dtype: int64
- name: '16'
dtype: int64
- name: '160'
dtype: int64
- name: '161'
dtype: int64
- name: '162'
dtype: int64
- name: '163'
dtype: int64
- name: '164'
dtype: int64
- name: '165'
dtype: int64
- name: '166'
dtype: 'null'
- name: '167'
dtype: int64
- name: '168'
dtype: int64
- name: '169'
dtype: 'null'
- name: '17'
dtype: int64
- name: '170'
dtype: int64
- name: '171'
dtype: int64
- name: '172'
dtype: int64
- name: '173'
dtype: int64
- name: '174'
dtype: int64
- name: '175'
dtype: int64
- name: '176'
dtype: int64
- name: '177'
dtype: int64
- name: '178'
dtype: int64
- name: '179'
dtype: int64
- name: '18'
dtype: int64
- name: '180'
dtype: int64
- name: '181'
dtype: int64
- name: '182'
dtype: 'null'
- name: '183'
dtype: int64
- name: '184'
dtype: int64
- name: '185'
dtype: int64
- name: '186'
dtype: 'null'
- name: '187'
dtype: 'null'
- name: '188'
dtype: int64
- name: '189'
dtype: int64
- name: '19'
dtype: int64
- name: '190'
dtype: int64
- name: '191'
dtype: int64
- name: '192'
dtype: int64
- name: '193'
dtype: int64
- name: '194'
dtype: int64
- name: '195'
dtype: int64
- name: '196'
dtype: int64
- name: '197'
dtype: int64
- name: '198'
dtype: int64
- name: '199'
dtype: int64
- name: '2'
dtype: int64
- name: '20'
dtype: int64
- name: '200'
dtype: int64
- name: '201'
dtype: int64
- name: '202'
dtype: 'null'
- name: '203'
dtype: 'null'
- name: '204'
dtype: int64
- name: '205'
dtype: int64
- name: '206'
dtype: 'null'
- name: '207'
dtype: int64
- name: '208'
dtype: 'null'
- name: '209'
dtype: int64
- name: '21'
dtype: int64
- name: '210'
dtype: int64
- name: '211'
dtype: int64
- name: '212'
dtype: 'null'
- name: '213'
dtype: int64
- name: '214'
dtype: int64
- name: '215'
dtype: int64
- name: '216'
dtype: 'null'
- name: '217'
dtype: int64
- name: '218'
dtype: int64
- name: '219'
dtype: 'null'
- name: '22'
dtype: int64
- name: '220'
dtype: int64
- name: '221'
dtype: int64
- name: '222'
dtype: int64
- name: '223'
dtype: int64
- name: '224'
dtype: 'null'
- name: '225'
dtype: int64
- name: '226'
dtype: 'null'
- name: '227'
dtype: 'null'
- name: '228'
dtype: int64
- name: '229'
dtype: int64
- name: '23'
dtype: int64
- name: '230'
dtype: 'null'
- name: '231'
dtype: int64
- name: '232'
dtype: int64
- name: '233'
dtype: int64
- name: '234'
dtype: 'null'
- name: '235'
dtype: int64
- name: '236'
dtype: int64
- name: '237'
dtype: 'null'
- name: '238'
dtype: int64
- name: '239'
dtype: int64
- name: '24'
dtype: int64
- name: '240'
dtype: int64
- name: '241'
dtype: int64
- name: '242'
dtype: int64
- name: '243'
dtype: int64
- name: '244'
dtype: int64
- name: '245'
dtype: int64
- name: '246'
dtype: 'null'
- name: '247'
dtype: 'null'
- name: '248'
dtype: 'null'
- name: '249'
dtype: int64
- name: '25'
dtype: int64
- name: '250'
dtype: int64
- name: '251'
dtype: 'null'
- name: '252'
dtype: 'null'
- name: '253'
dtype: int64
- name: '254'
dtype: int64
- name: '255'
dtype: 'null'
- name: '256'
dtype: int64
- name: '257'
dtype: 'null'
- name: '258'
dtype: 'null'
- name: '259'
dtype: int64
- name: '26'
dtype: int64
- name: '260'
dtype: 'null'
- name: '261'
dtype: 'null'
- name: '262'
dtype: int64
- name: '263'
dtype: int64
- name: '264'
dtype: 'null'
- name: '265'
dtype: 'null'
- name: '266'
dtype: 'null'
- name: '267'
dtype: int64
- name: '268'
dtype: 'null'
- name: '269'
dtype: int64
- name: '27'
dtype: int64
- name: '270'
dtype: int64
- name: '271'
dtype: 'null'
- name: '272'
dtype: 'null'
- name: '273'
dtype: 'null'
- name: '274'
dtype: int64
- name: '275'
dtype: int64
- name: '276'
dtype: int64
- name: '277'
dtype: int64
- name: '278'
dtype: 'null'
- name: '279'
dtype: int64
- name: '28'
dtype: int64
- name: '280'
dtype: int64
- name: '281'
dtype: int64
- name: '282'
dtype: int64
- name: '283'
dtype: int64
- name: '284'
dtype: int64
- name: '285'
dtype: int64
- name: '286'
dtype: 'null'
- name: '287'
dtype: 'null'
- name: '288'
dtype: 'null'
- name: '289'
dtype: 'null'
- name: '29'
dtype: int64
- name: '290'
dtype: int64
- name: '291'
dtype: 'null'
- name: '292'
dtype: 'null'
- name: '293'
dtype: 'null'
- name: '294'
dtype: 'null'
- name: '295'
dtype: 'null'
- name: '296'
dtype: 'null'
- name: '297'
dtype: 'null'
- name: '298'
dtype: 'null'
- name: '299'
dtype: 'null'
- name: '3'
dtype: int64
- name: '30'
dtype: int64
- name: '300'
dtype: 'null'
- name: '301'
dtype: 'null'
- name: '302'
dtype: 'null'
- name: '303'
dtype: 'null'
- name: '304'
dtype: 'null'
- name: '305'
dtype: 'null'
- name: '306'
dtype: 'null'
- name: '307'
dtype: 'null'
- name: '308'
dtype: 'null'
- name: '309'
dtype: 'null'
- name: '31'
dtype: int64
- name: '310'
dtype: 'null'
- name: '311'
dtype: 'null'
- name: '312'
dtype: 'null'
- name: '313'
dtype: 'null'
- name: '314'
dtype: 'null'
- name: '315'
dtype: 'null'
- name: '316'
dtype: 'null'
- name: '317'
dtype: 'null'
- name: '318'
dtype: 'null'
- name: '319'
dtype: 'null'
- name: '32'
dtype: int64
- name: '320'
dtype: 'null'
- name: '321'
dtype: 'null'
- name: '322'
dtype: 'null'
- name: '323'
dtype: 'null'
- name: '324'
dtype: 'null'
- name: '325'
dtype: 'null'
- name: '326'
dtype: 'null'
- name: '327'
dtype: 'null'
- name: '328'
dtype: 'null'
- name: '329'
dtype: 'null'
- name: '33'
dtype: int64
- name: '330'
dtype: 'null'
- name: '331'
dtype: 'null'
- name: '332'
dtype: 'null'
- name: '333'
dtype: 'null'
- name: '334'
dtype: 'null'
- name: '335'
dtype: 'null'
- name: '336'
dtype: 'null'
- name: '337'
dtype: 'null'
- name: '338'
dtype: 'null'
- name: '339'
dtype: 'null'
- name: '34'
dtype: int64
- name: '340'
dtype: 'null'
- name: '341'
dtype: 'null'
- name: '342'
dtype: 'null'
- name: '343'
dtype: 'null'
- name: '344'
dtype: 'null'
- name: '345'
dtype: 'null'
- name: '346'
dtype: 'null'
- name: '347'
dtype: 'null'
- name: '348'
dtype: 'null'
- name: '349'
dtype: 'null'
- name: '35'
dtype: int64
- name: '350'
dtype: 'null'
- name: '351'
dtype: 'null'
- name: '352'
dtype: 'null'
- name: '353'
dtype: 'null'
- name: '354'
dtype: 'null'
- name: '355'
dtype: 'null'
- name: '356'
dtype: 'null'
- name: '357'
dtype: 'null'
- name: '358'
dtype: 'null'
- name: '359'
dtype: 'null'
- name: '36'
dtype: int64
- name: '360'
dtype: 'null'
- name: '361'
dtype: 'null'
- name: '362'
dtype: 'null'
- name: '363'
dtype: 'null'
- name: '364'
dtype: 'null'
- name: '365'
dtype: 'null'
- name: '366'
dtype: 'null'
- name: '367'
dtype: 'null'
- name: '368'
dtype: 'null'
- name: '369'
dtype: 'null'
- name: '37'
dtype: int64
- name: '370'
dtype: 'null'
- name: '371'
dtype: 'null'
- name: '372'
dtype: 'null'
- name: '373'
dtype: 'null'
- name: '374'
dtype: 'null'
- name: '375'
dtype: 'null'
- name: '376'
dtype: 'null'
- name: '377'
dtype: 'null'
- name: '378'
dtype: 'null'
- name: '379'
dtype: 'null'
- name: '38'
dtype: int64
- name: '380'
dtype: 'null'
- name: '381'
dtype: 'null'
- name: '382'
dtype: 'null'
- name: '383'
dtype: 'null'
- name: '384'
dtype: 'null'
- name: '385'
dtype: 'null'
- name: '386'
dtype: 'null'
- name: '387'
dtype: 'null'
- name: '388'
dtype: 'null'
- name: '389'
dtype: 'null'
- name: '39'
dtype: int64
- name: '390'
dtype: 'null'
- name: '391'
dtype: 'null'
- name: '392'
dtype: 'null'
- name: '393'
dtype: 'null'
- name: '394'
dtype: 'null'
- name: '395'
dtype: 'null'
- name: '396'
dtype: 'null'
- name: '397'
dtype: 'null'
- name: '4'
dtype: int64
- name: '40'
dtype: int64
- name: '41'
dtype: int64
- name: '42'
dtype: int64
- name: '43'
dtype: int64
- name: '44'
dtype: int64
- name: '45'
dtype: int64
- name: '46'
dtype: int64
- name: '47'
dtype: int64
- name: '48'
dtype: int64
- name: '49'
dtype: int64
- name: '5'
dtype: int64
- name: '50'
dtype: int64
- name: '51'
dtype: int64
- name: '52'
dtype: int64
- name: '53'
dtype: int64
- name: '54'
dtype: int64
- name: '55'
dtype: int64
- name: '56'
dtype: int64
- name: '57'
dtype: int64
- name: '58'
dtype: int64
- name: '59'
dtype: int64
- name: '6'
dtype: int64
- name: '60'
dtype: int64
- name: '61'
dtype: int64
- name: '62'
dtype: int64
- name: '63'
dtype: int64
- name: '64'
dtype: int64
- name: '65'
dtype: int64
- name: '66'
dtype: int64
- name: '67'
dtype: int64
- name: '68'
dtype: int64
- name: '69'
dtype: int64
- name: '7'
dtype: int64
- name: '70'
dtype: int64
- name: '71'
dtype: int64
- name: '72'
dtype: int64
- name: '73'
dtype: int64
- name: '74'
dtype: int64
- name: '75'
dtype: int64
- name: '76'
dtype: int64
- name: '77'
dtype: int64
- name: '78'
dtype: int64
- name: '79'
dtype: int64
- name: '8'
dtype: int64
- name: '80'
dtype: int64
- name: '81'
dtype: int64
- name: '82'
dtype: int64
- name: '83'
dtype: int64
- name: '84'
dtype: int64
- name: '85'
dtype: int64
- name: '86'
dtype: int64
- name: '87'
dtype: int64
- name: '88'
dtype: int64
- name: '89'
dtype: int64
- name: '9'
dtype: int64
- name: '90'
dtype: int64
- name: '91'
dtype: int64
- name: '92'
dtype: int64
- name: '93'
dtype: int64
- name: '94'
dtype: int64
- name: '95'
dtype: int64
- name: '96'
dtype: int64
- name: '97'
dtype: int64
- name: '98'
dtype: int64
- name: '99'
dtype: int64
- name: en_tokens
struct:
- name: '0'
dtype: string
- name: '1'
dtype: string
- name: '10'
dtype: string
- name: '100'
dtype: string
- name: '101'
dtype: string
- name: '102'
dtype: string
- name: '103'
dtype: string
- name: '104'
dtype: string
- name: '105'
dtype: string
- name: '106'
dtype: string
- name: '107'
dtype: string
- name: '108'
dtype: string
- name: '109'
dtype: string
- name: '11'
dtype: string
- name: '110'
dtype: string
- name: '111'
dtype: string
- name: '112'
dtype: string
- name: '113'
dtype: string
- name: '114'
dtype: string
- name: '115'
dtype: string
- name: '116'
dtype: string
- name: '117'
dtype: string
- name: '118'
dtype: string
- name: '119'
dtype: string
- name: '12'
dtype: string
- name: '120'
dtype: string
- name: '121'
dtype: string
- name: '122'
dtype: string
- name: '123'
dtype: string
- name: '124'
dtype: string
- name: '125'
dtype: string
- name: '126'
dtype: string
- name: '127'
dtype: string
- name: '128'
dtype: string
- name: '129'
dtype: string
- name: '13'
dtype: string
- name: '130'
dtype: string
- name: '131'
dtype: string
- name: '132'
dtype: string
- name: '133'
dtype: string
- name: '134'
dtype: string
- name: '135'
dtype: string
- name: '136'
dtype: string
- name: '137'
dtype: string
- name: '138'
dtype: string
- name: '139'
dtype: string
- name: '14'
dtype: string
- name: '140'
dtype: string
- name: '141'
dtype: string
- name: '142'
dtype: string
- name: '143'
dtype: string
- name: '144'
dtype: string
- name: '145'
dtype: string
- name: '146'
dtype: string
- name: '147'
dtype: string
- name: '148'
dtype: string
- name: '149'
dtype: string
- name: '15'
dtype: string
- name: '150'
dtype: string
- name: '151'
dtype: string
- name: '152'
dtype: string
- name: '153'
dtype: string
- name: '154'
dtype: string
- name: '155'
dtype: string
- name: '156'
dtype: string
- name: '157'
dtype: string
- name: '158'
dtype: string
- name: '159'
dtype: string
- name: '16'
dtype: string
- name: '160'
dtype: string
- name: '161'
dtype: string
- name: '162'
dtype: string
- name: '163'
dtype: string
- name: '164'
dtype: string
- name: '165'
dtype: string
- name: '166'
dtype: string
- name: '167'
dtype: string
- name: '168'
dtype: string
- name: '169'
dtype: string
- name: '17'
dtype: string
- name: '170'
dtype: string
- name: '171'
dtype: string
- name: '172'
dtype: string
- name: '173'
dtype: string
- name: '174'
dtype: string
- name: '175'
dtype: string
- name: '176'
dtype: string
- name: '177'
dtype: string
- name: '178'
dtype: string
- name: '179'
dtype: string
- name: '18'
dtype: string
- name: '180'
dtype: string
- name: '181'
dtype: string
- name: '182'
dtype: string
- name: '183'
dtype: string
- name: '184'
dtype: string
- name: '185'
dtype: string
- name: '186'
dtype: string
- name: '187'
dtype: string
- name: '188'
dtype: string
- name: '189'
dtype: string
- name: '19'
dtype: string
- name: '190'
dtype: string
- name: '191'
dtype: string
- name: '192'
dtype: string
- name: '193'
dtype: string
- name: '194'
dtype: string
- name: '195'
dtype: string
- name: '196'
dtype: string
- name: '197'
dtype: string
- name: '198'
dtype: string
- name: '199'
dtype: string
- name: '2'
dtype: string
- name: '20'
dtype: string
- name: '200'
dtype: string
- name: '201'
dtype: string
- name: '202'
dtype: string
- name: '203'
dtype: string
- name: '204'
dtype: string
- name: '205'
dtype: string
- name: '206'
dtype: string
- name: '207'
dtype: string
- name: '208'
dtype: string
- name: '209'
dtype: string
- name: '21'
dtype: string
- name: '210'
dtype: string
- name: '211'
dtype: string
- name: '212'
dtype: string
- name: '213'
dtype: string
- name: '214'
dtype: string
- name: '215'
dtype: string
- name: '216'
dtype: string
- name: '217'
dtype: string
- name: '218'
dtype: string
- name: '219'
dtype: string
- name: '22'
dtype: string
- name: '220'
dtype: string
- name: '221'
dtype: string
- name: '222'
dtype: string
- name: '223'
dtype: string
- name: '224'
dtype: string
- name: '225'
dtype: string
- name: '226'
dtype: string
- name: '227'
dtype: string
- name: '228'
dtype: string
- name: '229'
dtype: string
- name: '23'
dtype: string
- name: '230'
dtype: string
- name: '231'
dtype: string
- name: '232'
dtype: string
- name: '233'
dtype: string
- name: '234'
dtype: string
- name: '235'
dtype: string
- name: '236'
dtype: string
- name: '237'
dtype: string
- name: '238'
dtype: string
- name: '239'
dtype: string
- name: '24'
dtype: string
- name: '240'
dtype: string
- name: '241'
dtype: string
- name: '242'
dtype: string
- name: '243'
dtype: string
- name: '244'
dtype: string
- name: '245'
dtype: string
- name: '246'
dtype: string
- name: '247'
dtype: string
- name: '248'
dtype: string
- name: '249'
dtype: string
- name: '25'
dtype: string
- name: '250'
dtype: string
- name: '251'
dtype: string
- name: '252'
dtype: string
- name: '253'
dtype: string
- name: '254'
dtype: string
- name: '255'
dtype: string
- name: '256'
dtype: string
- name: '257'
dtype: string
- name: '258'
dtype: string
- name: '259'
dtype: string
- name: '26'
dtype: string
- name: '260'
dtype: string
- name: '261'
dtype: string
- name: '262'
dtype: string
- name: '263'
dtype: string
- name: '264'
dtype: string
- name: '265'
dtype: string
- name: '266'
dtype: string
- name: '267'
dtype: string
- name: '268'
dtype: string
- name: '269'
dtype: string
- name: '27'
dtype: string
- name: '270'
dtype: string
- name: '271'
dtype: string
- name: '272'
dtype: string
- name: '273'
dtype: string
- name: '274'
dtype: string
- name: '275'
dtype: string
- name: '276'
dtype: string
- name: '277'
dtype: string
- name: '278'
dtype: string
- name: '279'
dtype: string
- name: '28'
dtype: string
- name: '280'
dtype: string
- name: '281'
dtype: string
- name: '282'
dtype: string
- name: '283'
dtype: string
- name: '284'
dtype: string
- name: '285'
dtype: string
- name: '286'
dtype: string
- name: '287'
dtype: string
- name: '288'
dtype: string
- name: '289'
dtype: string
- name: '29'
dtype: string
- name: '290'
dtype: string
- name: '291'
dtype: 'null'
- name: '292'
dtype: 'null'
- name: '293'
dtype: 'null'
- name: '294'
dtype: 'null'
- name: '295'
dtype: 'null'
- name: '296'
dtype: 'null'
- name: '297'
dtype: 'null'
- name: '298'
dtype: 'null'
- name: '299'
dtype: 'null'
- name: '3'
dtype: string
- name: '30'
dtype: string
- name: '300'
dtype: 'null'
- name: '301'
dtype: 'null'
- name: '302'
dtype: 'null'
- name: '303'
dtype: 'null'
- name: '304'
dtype: 'null'
- name: '305'
dtype: 'null'
- name: '306'
dtype: 'null'
- name: '307'
dtype: 'null'
- name: '308'
dtype: 'null'
- name: '309'
dtype: 'null'
- name: '31'
dtype: string
- name: '310'
dtype: 'null'
- name: '311'
dtype: 'null'
- name: '312'
dtype: 'null'
- name: '313'
dtype: 'null'
- name: '314'
dtype: 'null'
- name: '315'
dtype: 'null'
- name: '316'
dtype: 'null'
- name: '317'
dtype: 'null'
- name: '318'
dtype: 'null'
- name: '319'
dtype: 'null'
- name: '32'
dtype: string
- name: '320'
dtype: 'null'
- name: '321'
dtype: 'null'
- name: '322'
dtype: 'null'
- name: '323'
dtype: 'null'
- name: '324'
dtype: 'null'
- name: '325'
dtype: 'null'
- name: '326'
dtype: 'null'
- name: '327'
dtype: 'null'
- name: '328'
dtype: 'null'
- name: '329'
dtype: 'null'
- name: '33'
dtype: string
- name: '330'
dtype: 'null'
- name: '331'
dtype: 'null'
- name: '332'
dtype: 'null'
- name: '333'
dtype: 'null'
- name: '334'
dtype: 'null'
- name: '335'
dtype: 'null'
- name: '336'
dtype: 'null'
- name: '337'
dtype: 'null'
- name: '338'
dtype: 'null'
- name: '339'
dtype: 'null'
- name: '34'
dtype: string
- name: '340'
dtype: 'null'
- name: '341'
dtype: 'null'
- name: '342'
dtype: 'null'
- name: '343'
dtype: 'null'
- name: '344'
dtype: 'null'
- name: '345'
dtype: 'null'
- name: '346'
dtype: 'null'
- name: '347'
dtype: 'null'
- name: '348'
dtype: 'null'
- name: '349'
dtype: 'null'
- name: '35'
dtype: string
- name: '350'
dtype: 'null'
- name: '351'
dtype: 'null'
- name: '352'
dtype: 'null'
- name: '353'
dtype: 'null'
- name: '354'
dtype: 'null'
- name: '355'
dtype: 'null'
- name: '356'
dtype: 'null'
- name: '357'
dtype: 'null'
- name: '358'
dtype: 'null'
- name: '359'
dtype: 'null'
- name: '36'
dtype: string
- name: '360'
dtype: 'null'
- name: '361'
dtype: 'null'
- name: '362'
dtype: 'null'
- name: '363'
dtype: 'null'
- name: '364'
dtype: 'null'
- name: '365'
dtype: 'null'
- name: '366'
dtype: 'null'
- name: '367'
dtype: 'null'
- name: '368'
dtype: 'null'
- name: '369'
dtype: 'null'
- name: '37'
dtype: string
- name: '370'
dtype: 'null'
- name: '371'
dtype: 'null'
- name: '372'
dtype: 'null'
- name: '373'
dtype: 'null'
- name: '374'
dtype: 'null'
- name: '375'
dtype: 'null'
- name: '376'
dtype: 'null'
- name: '377'
dtype: 'null'
- name: '378'
dtype: 'null'
- name: '379'
dtype: 'null'
- name: '38'
dtype: string
- name: '380'
dtype: 'null'
- name: '381'
dtype: 'null'
- name: '382'
dtype: 'null'
- name: '383'
dtype: 'null'
- name: '384'
dtype: 'null'
- name: '385'
dtype: 'null'
- name: '386'
dtype: 'null'
- name: '387'
dtype: 'null'
- name: '388'
dtype: 'null'
- name: '389'
dtype: 'null'
- name: '39'
dtype: string
- name: '390'
dtype: 'null'
- name: '391'
dtype: 'null'
- name: '392'
dtype: 'null'
- name: '393'
dtype: 'null'
- name: '394'
dtype: 'null'
- name: '395'
dtype: 'null'
- name: '396'
dtype: 'null'
- name: '397'
dtype: 'null'
- name: '4'
dtype: string
- name: '40'
dtype: string
- name: '41'
dtype: string
- name: '42'
dtype: string
- name: '43'
dtype: string
- name: '44'
dtype: string
- name: '45'
dtype: string
- name: '46'
dtype: string
- name: '47'
dtype: string
- name: '48'
dtype: string
- name: '49'
dtype: string
- name: '5'
dtype: string
- name: '50'
dtype: string
- name: '51'
dtype: string
- name: '52'
dtype: string
- name: '53'
dtype: string
- name: '54'
dtype: string
- name: '55'
dtype: string
- name: '56'
dtype: string
- name: '57'
dtype: string
- name: '58'
dtype: string
- name: '59'
dtype: string
- name: '6'
dtype: string
- name: '60'
dtype: string
- name: '61'
dtype: string
- name: '62'
dtype: string
- name: '63'
dtype: string
- name: '64'
dtype: string
- name: '65'
dtype: string
- name: '66'
dtype: string
- name: '67'
dtype: string
- name: '68'
dtype: string
- name: '69'
dtype: string
- name: '7'
dtype: string
- name: '70'
dtype: string
- name: '71'
dtype: string
- name: '72'
dtype: string
- name: '73'
dtype: string
- name: '74'
dtype: string
- name: '75'
dtype: string
- name: '76'
dtype: string
- name: '77'
dtype: string
- name: '78'
dtype: string
- name: '79'
dtype: string
- name: '8'
dtype: string
- name: '80'
dtype: string
- name: '81'
dtype: string
- name: '82'
dtype: string
- name: '83'
dtype: string
- name: '84'
dtype: string
- name: '85'
dtype: string
- name: '86'
dtype: string
- name: '87'
dtype: string
- name: '88'
dtype: string
- name: '89'
dtype: string
- name: '9'
dtype: string
- name: '90'
dtype: string
- name: '91'
dtype: string
- name: '92'
dtype: string
- name: '93'
dtype: string
- name: '94'
dtype: string
- name: '95'
dtype: string
- name: '96'
dtype: string
- name: '97'
dtype: string
- name: '98'
dtype: string
- name: '99'
dtype: string
- name: lang_tokens
struct:
- name: '0'
dtype: string
- name: '1'
dtype: string
- name: '10'
dtype: string
- name: '100'
dtype: string
- name: '101'
dtype: string
- name: '102'
dtype: string
- name: '103'
dtype: string
- name: '104'
dtype: string
- name: '105'
dtype: string
- name: '106'
dtype: string
- name: '107'
dtype: string
- name: '108'
dtype: string
- name: '109'
dtype: string
- name: '11'
dtype: string
- name: '110'
dtype: string
- name: '111'
dtype: string
- name: '112'
dtype: string
- name: '113'
dtype: string
- name: '114'
dtype: string
- name: '115'
dtype: string
- name: '116'
dtype: string
- name: '117'
dtype: string
- name: '118'
dtype: string
- name: '119'
dtype: string
- name: '12'
dtype: string
- name: '120'
dtype: string
- name: '121'
dtype: string
- name: '122'
dtype: string
- name: '123'
dtype: string
- name: '124'
dtype: string
- name: '125'
dtype: string
- name: '126'
dtype: string
- name: '127'
dtype: string
- name: '128'
dtype: string
- name: '129'
dtype: string
- name: '13'
dtype: string
- name: '130'
dtype: string
- name: '131'
dtype: string
- name: '132'
dtype: string
- name: '133'
dtype: string
- name: '134'
dtype: string
- name: '135'
dtype: string
- name: '136'
dtype: string
- name: '137'
dtype: string
- name: '138'
dtype: string
- name: '139'
dtype: string
- name: '14'
dtype: string
- name: '140'
dtype: string
- name: '141'
dtype: string
- name: '142'
dtype: string
- name: '143'
dtype: string
- name: '144'
dtype: string
- name: '145'
dtype: string
- name: '146'
dtype: string
- name: '147'
dtype: string
- name: '148'
dtype: string
- name: '149'
dtype: string
- name: '15'
dtype: string
- name: '150'
dtype: string
- name: '151'
dtype: string
- name: '152'
dtype: string
- name: '153'
dtype: string
- name: '154'
dtype: string
- name: '155'
dtype: string
- name: '156'
dtype: string
- name: '157'
dtype: string
- name: '158'
dtype: string
- name: '159'
dtype: string
- name: '16'
dtype: string
- name: '160'
dtype: string
- name: '161'
dtype: string
- name: '162'
dtype: string
- name: '163'
dtype: string
- name: '164'
dtype: string
- name: '165'
dtype: string
- name: '166'
dtype: string
- name: '167'
dtype: string
- name: '168'
dtype: string
- name: '169'
dtype: string
- name: '17'
dtype: string
- name: '170'
dtype: string
- name: '171'
dtype: string
- name: '172'
dtype: string
- name: '173'
dtype: string
- name: '174'
dtype: string
- name: '175'
dtype: string
- name: '176'
dtype: string
- name: '177'
dtype: string
- name: '178'
dtype: string
- name: '179'
dtype: string
- name: '18'
dtype: string
- name: '180'
dtype: string
- name: '181'
dtype: string
- name: '182'
dtype: string
- name: '183'
dtype: string
- name: '184'
dtype: string
- name: '185'
dtype: string
- name: '186'
dtype: string
- name: '187'
dtype: string
- name: '188'
dtype: string
- name: '189'
dtype: string
- name: '19'
dtype: string
- name: '190'
dtype: string
- name: '191'
dtype: string
- name: '192'
dtype: string
- name: '193'
dtype: string
- name: '194'
dtype: string
- name: '195'
dtype: string
- name: '196'
dtype: string
- name: '197'
dtype: string
- name: '198'
dtype: string
- name: '199'
dtype: string
- name: '2'
dtype: string
- name: '20'
dtype: string
- name: '200'
dtype: string
- name: '201'
dtype: string
- name: '202'
dtype: string
- name: '203'
dtype: string
- name: '204'
dtype: string
- name: '205'
dtype: string
- name: '206'
dtype: string
- name: '207'
dtype: string
- name: '208'
dtype: string
- name: '209'
dtype: string
- name: '21'
dtype: string
- name: '210'
dtype: string
- name: '211'
dtype: string
- name: '212'
dtype: string
- name: '213'
dtype: string
- name: '214'
dtype: string
- name: '215'
dtype: string
- name: '216'
dtype: string
- name: '217'
dtype: string
- name: '218'
dtype: string
- name: '219'
dtype: string
- name: '22'
dtype: string
- name: '220'
dtype: string
- name: '221'
dtype: string
- name: '222'
dtype: string
- name: '223'
dtype: string
- name: '224'
dtype: string
- name: '225'
dtype: string
- name: '226'
dtype: string
- name: '227'
dtype: string
- name: '228'
dtype: string
- name: '229'
dtype: string
- name: '23'
dtype: string
- name: '230'
dtype: string
- name: '231'
dtype: string
- name: '232'
dtype: string
- name: '233'
dtype: string
- name: '234'
dtype: string
- name: '235'
dtype: string
- name: '236'
dtype: string
- name: '237'
dtype: string
- name: '238'
dtype: string
- name: '239'
dtype: string
- name: '24'
dtype: string
- name: '240'
dtype: string
- name: '241'
dtype: string
- name: '242'
dtype: string
- name: '243'
dtype: string
- name: '244'
dtype: string
- name: '245'
dtype: string
- name: '246'
dtype: string
- name: '247'
dtype: string
- name: '248'
dtype: string
- name: '249'
dtype: string
- name: '25'
dtype: string
- name: '250'
dtype: string
- name: '251'
dtype: string
- name: '252'
dtype: string
- name: '253'
dtype: string
- name: '254'
dtype: string
- name: '255'
dtype: string
- name: '256'
dtype: string
- name: '257'
dtype: string
- name: '258'
dtype: string
- name: '259'
dtype: string
- name: '26'
dtype: string
- name: '260'
dtype: string
- name: '261'
dtype: string
- name: '262'
dtype: string
- name: '263'
dtype: string
- name: '264'
dtype: string
- name: '265'
dtype: string
- name: '266'
dtype: string
- name: '267'
dtype: string
- name: '268'
dtype: string
- name: '269'
dtype: string
- name: '27'
dtype: string
- name: '270'
dtype: string
- name: '271'
dtype: string
- name: '272'
dtype: string
- name: '273'
dtype: string
- name: '274'
dtype: string
- name: '275'
dtype: string
- name: '276'
dtype: string
- name: '277'
dtype: string
- name: '278'
dtype: string
- name: '279'
dtype: string
- name: '28'
dtype: string
- name: '280'
dtype: string
- name: '281'
dtype: 'null'
- name: '282'
dtype: 'null'
- name: '283'
dtype: 'null'
- name: '284'
dtype: 'null'
- name: '285'
dtype: 'null'
- name: '286'
dtype: 'null'
- name: '287'
dtype: 'null'
- name: '288'
dtype: 'null'
- name: '289'
dtype: 'null'
- name: '29'
dtype: string
- name: '290'
dtype: 'null'
- name: '291'
dtype: 'null'
- name: '292'
dtype: 'null'
- name: '293'
dtype: 'null'
- name: '294'
dtype: 'null'
- name: '295'
dtype: 'null'
- name: '296'
dtype: 'null'
- name: '297'
dtype: 'null'
- name: '298'
dtype: 'null'
- name: '299'
dtype: 'null'
- name: '3'
dtype: string
- name: '30'
dtype: string
- name: '300'
dtype: 'null'
- name: '301'
dtype: 'null'
- name: '302'
dtype: 'null'
- name: '303'
dtype: 'null'
- name: '304'
dtype: 'null'
- name: '305'
dtype: 'null'
- name: '306'
dtype: 'null'
- name: '307'
dtype: 'null'
- name: '308'
dtype: 'null'
- name: '309'
dtype: 'null'
- name: '31'
dtype: string
- name: '310'
dtype: 'null'
- name: '311'
dtype: 'null'
- name: '312'
dtype: 'null'
- name: '313'
dtype: 'null'
- name: '314'
dtype: 'null'
- name: '315'
dtype: 'null'
- name: '316'
dtype: 'null'
- name: '317'
dtype: 'null'
- name: '318'
dtype: 'null'
- name: '319'
dtype: 'null'
- name: '32'
dtype: string
- name: '320'
dtype: 'null'
- name: '321'
dtype: 'null'
- name: '322'
dtype: 'null'
- name: '323'
dtype: 'null'
- name: '324'
dtype: 'null'
- name: '325'
dtype: 'null'
- name: '326'
dtype: 'null'
- name: '327'
dtype: 'null'
- name: '328'
dtype: 'null'
- name: '329'
dtype: 'null'
- name: '33'
dtype: string
- name: '330'
dtype: 'null'
- name: '331'
dtype: 'null'
- name: '332'
dtype: 'null'
- name: '333'
dtype: 'null'
- name: '334'
dtype: 'null'
- name: '335'
dtype: 'null'
- name: '336'
dtype: 'null'
- name: '337'
dtype: 'null'
- name: '338'
dtype: 'null'
- name: '339'
dtype: 'null'
- name: '34'
dtype: string
- name: '340'
dtype: 'null'
- name: '341'
dtype: 'null'
- name: '342'
dtype: 'null'
- name: '343'
dtype: 'null'
- name: '344'
dtype: 'null'
- name: '345'
dtype: 'null'
- name: '346'
dtype: 'null'
- name: '347'
dtype: 'null'
- name: '348'
dtype: 'null'
- name: '349'
dtype: 'null'
- name: '35'
dtype: string
- name: '350'
dtype: 'null'
- name: '351'
dtype: 'null'
- name: '352'
dtype: 'null'
- name: '353'
dtype: 'null'
- name: '354'
dtype: 'null'
- name: '355'
dtype: 'null'
- name: '356'
dtype: 'null'
- name: '357'
dtype: 'null'
- name: '358'
dtype: 'null'
- name: '359'
dtype: 'null'
- name: '36'
dtype: string
- name: '360'
dtype: 'null'
- name: '361'
dtype: 'null'
- name: '362'
dtype: 'null'
- name: '363'
dtype: 'null'
- name: '364'
dtype: 'null'
- name: '365'
dtype: 'null'
- name: '366'
dtype: 'null'
- name: '367'
dtype: 'null'
- name: '368'
dtype: 'null'
- name: '369'
dtype: 'null'
- name: '37'
dtype: string
- name: '370'
dtype: 'null'
- name: '371'
dtype: 'null'
- name: '372'
dtype: 'null'
- name: '373'
dtype: 'null'
- name: '374'
dtype: 'null'
- name: '375'
dtype: 'null'
- name: '376'
dtype: 'null'
- name: '38'
dtype: string
- name: '39'
dtype: string
- name: '4'
dtype: string
- name: '40'
dtype: string
- name: '41'
dtype: string
- name: '42'
dtype: string
- name: '43'
dtype: string
- name: '44'
dtype: string
- name: '45'
dtype: string
- name: '46'
dtype: string
- name: '47'
dtype: string
- name: '48'
dtype: string
- name: '49'
dtype: string
- name: '5'
dtype: string
- name: '50'
dtype: string
- name: '51'
dtype: string
- name: '52'
dtype: string
- name: '53'
dtype: string
- name: '54'
dtype: string
- name: '55'
dtype: string
- name: '56'
dtype: string
- name: '57'
dtype: string
- name: '58'
dtype: string
- name: '59'
dtype: string
- name: '6'
dtype: string
- name: '60'
dtype: string
- name: '61'
dtype: string
- name: '62'
dtype: string
- name: '63'
dtype: string
- name: '64'
dtype: string
- name: '65'
dtype: string
- name: '66'
dtype: string
- name: '67'
dtype: string
- name: '68'
dtype: string
- name: '69'
dtype: string
- name: '7'
dtype: string
- name: '70'
dtype: string
- name: '71'
dtype: string
- name: '72'
dtype: string
- name: '73'
dtype: string
- name: '74'
dtype: string
- name: '75'
dtype: string
- name: '76'
dtype: string
- name: '77'
dtype: string
- name: '78'
dtype: string
- name: '79'
dtype: string
- name: '8'
dtype: string
- name: '80'
dtype: string
- name: '81'
dtype: string
- name: '82'
dtype: string
- name: '83'
dtype: string
- name: '84'
dtype: string
- name: '85'
dtype: string
- name: '86'
dtype: string
- name: '87'
dtype: string
- name: '88'
dtype: string
- name: '89'
dtype: string
- name: '9'
dtype: string
- name: '90'
dtype: string
- name: '91'
dtype: string
- name: '92'
dtype: string
- name: '93'
dtype: string
- name: '94'
dtype: string
- name: '95'
dtype: string
- name: '96'
dtype: string
- name: '97'
dtype: string
- name: '98'
dtype: string
- name: '99'
dtype: string
- name: parse
list:
- name: children
list:
- name: children
list:
- name: children
sequence: 'null'
- name: confidence
dtype: float64
- name: label
dtype: string
- name: span
sequence: int64
- name: confidence
dtype: float64
- name: label
dtype: string
- name: span
sequence: int64
- name: confidence
dtype: float64
- name: label
dtype: string
- name: span
sequence: int64
- name: text
sequence: string
- name: qa_pairs
list:
- name: en_answer
dtype: string
- name: en_answer_tokens
sequence: string
- name: en_match_in_passage
sequence: int64
- name: en_matches_in_source
sequence:
sequence: int64
- name: frames
list:
- name: argument
dtype: string
- name: frame
dtype: string
- name: lang_answer
dtype: string
- name: lang_match_in_passage
sequence: int64
- name: lang_matches_in_source
sequence:
sequence: int64
- name: match_disambiguated_question
dtype: string
- name: passage
sequence: string
- name: passage_id
dtype: string
- name: question
dtype: string
- name: repetitious_translation
dtype: bool
- name: source_lang
dtype: string
- name: source_text
dtype: string
- name: source_url
dtype: string
- name: translation
dtype: string
- name: translation_probs
sequence: string
- name: translation_sents
sequence: string
splits:
- name: train
num_bytes: 50253843
num_examples: 842
download_size: 12279811
dataset_size: 50253843
configs:
- config_name: my
data_files:
- split: my
path: my/my-*
- config_name: my_refined
data_files:
- split: train
path: my_refined/train-*
---
# Dataset Card for "megawika"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_228 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1186307988
num_examples: 231159
download_size: 1208791652
dataset_size: 1186307988
---
# Dataset Card for "chunk_228"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ShenaoZ/0.001_ablation_dataset | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
- name: reference_response
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: is_better
dtype: bool
splits:
- name: train_prefs_1
num_bytes: 168081294
num_examples: 20378
- name: test_prefs_1
num_bytes: 16410846
num_examples: 2000
- name: train_prefs_2
num_bytes: 174168708
num_examples: 20378
- name: test_prefs_2
num_bytes: 16912720
num_examples: 2000
download_size: 207506841
dataset_size: 375573568
configs:
- config_name: default
data_files:
- split: train_prefs_1
path: data/train_prefs_1-*
- split: test_prefs_1
path: data/test_prefs_1-*
- split: train_prefs_2
path: data/train_prefs_2-*
- split: test_prefs_2
path: data/test_prefs_2-*
---
# Dataset Card for "0.001_ablation_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
johnsonlee/My-First-Dataset | ---
license: postgresql
language:
- zh
tags:
- finance
pretty_name: It's a good name.
size_categories:
- 1K<n<10K
---
This is my first dataset. |
davidberenstein1957/text2text-10-predictions | ---
dataset_info:
features:
- name: text
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 104132.0
num_examples: 72
- name: test
num_bytes: 26033.0
num_examples: 18
download_size: 86213
dataset_size: 130165.0
---
# Dataset Card for "text2text-10-predictions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
osyvokon/pavlick-formality-scores | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc-by-3.0
multilinguality:
- monolingual
pretty_name: 'Sentence-level formality annotations for news, blogs, email and QA forums.
Published in "An Empirical Analysis of Formality in Online Communication" (Pavlick
and Tetreault, 2016) '
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- text-scoring
---
This dataset contains sentence-level formality annotations used in the 2016
TACL paper "An Empirical Analysis of Formality in Online Communication"
(Pavlick and Tetreault, 2016). It includes sentences from four genres (news,
blogs, email, and QA forums), all annotated by humans on Amazon Mechanical
Turk. The news and blog data was collected by Shibamouli Lahiri, and we are
redistributing it here for the convenience of other researchers. We collected
the email and answers data ourselves, using a similar annotation setup to
Shibamouli.
In the original dataset, `answers` and `email` were tokenized. In this version,
Oleksiy Syvokon detokenized them with `moses-detokenizer` and a set of
additional regexps.
If you use this data in your work, please cite BOTH of the below papers:
```
@article{PavlickAndTetreault-2016:TACL,
author = {Ellie Pavlick and Joel Tetreault},
title = {An Empirical Analysis of Formality in Online Communication},
journal = {Transactions of the Association for Computational Linguistics},
year = {2016},
publisher = {Association for Computational Linguistics}
}
@article{Lahiri-2015:arXiv,
title={{SQUINKY! A} Corpus of Sentence-level Formality, Informativeness, and Implicature},
author={Lahiri, Shibamouli},
journal={arXiv preprint arXiv:1506.02306},
year={2015}
}
```
## Contents
The annotated data files and number of lines in each are as follows:
* 4977 answers -- Annotated sentences from a random sample of posts from the Yahoo! Answers forums: https://answers.yahoo.com/
* 1821 blog -- Annotated sentences from the top 100 blogs listed on http://technorati.com/ on October 31, 2009.
* 1701 email -- Annotated sentences from a random sample of emails from the Jeb Bush email archive: http://americanbridgepac.org/jeb-bushs-gubernatorial-email-archive/
* 2775 news -- Annotated sentences from the "breaking", "recent", and "local" news sections of the following 20 news sites: CNN, CBS News, ABC News, Reuters, BBC News Online, New York Times, Los Angeles Times, The Guardian (U.K.), Voice of America, Boston Globe, Chicago Tribune, San Francisco Chronicle, Times Online (U.K.), news.com.au, Xinhua, The Times of India, Seattle Post Intelligencer, Daily Mail, and Bloomberg L.P.
## Format
Each record contains the following fields:
1. `avg_score`: the mean formality rating, which ranges from -3 to 3, where lower scores indicate less formal sentences
2. `sentence`
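As a sketch of how these two fields might be used, here is a simple formality split over invented sample records (the sentences and scores below are illustrative placeholders, not taken from the dataset):

```python
# Hypothetical records mirroring the two fields described above:
# `avg_score` in [-3, 3] (lower = less formal) and `sentence`.
records = [
    {"avg_score": -2.1, "sentence": "lol yeah that movie was sooo good"},
    {"avg_score": 1.8, "sentence": "The committee will convene on Thursday."},
    {"avg_score": 0.2, "sentence": "Thanks for the update."},
]

# Split into informal vs. formal using 0 as a midpoint threshold.
informal = [r["sentence"] for r in records if r["avg_score"] < 0]
formal = [r["sentence"] for r in records if r["avg_score"] >= 0]

print(len(informal), len(formal))
```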
|
CyberHarem/altina_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of altina (Fire Emblem)
This is the dataset of altina (Fire Emblem), containing 54 images and their tags.
The core tags of this character are `long_hair, blue_eyes, purple_hair, breasts, bangs, large_breasts, very_long_hair, blue_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 54 | 83.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/altina_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 54 | 48.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/altina_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 127 | 90.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/altina_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 54 | 73.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/altina_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 127 | 124.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/altina_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/altina_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, armor, looking_at_viewer, solo, sword, dress, fingerless_gloves, holding, thighhighs, simple_background, white_background, boots, elbow_gloves |
| 1 | 5 |  |  |  |  |  | 1girl, bell, christmas, fur_trim, reindeer_antlers, solo, black_thighhighs, deer_ears, full_body, looking_at_viewer, candy_cane, fake_animal_ears, gift_box, holding_sword, medium_breasts, simple_background, smile, white_footwear, white_gloves, elbow_gloves, low-tied_long_hair, open_mouth, parted_lips |
| 2 | 17 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, solo, navel, smile, hat, blush, abs, collarbone, muscular_female, one-piece_swimsuit, simple_background, white_background, bracelet |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | armor | looking_at_viewer | solo | sword | dress | fingerless_gloves | holding | thighhighs | simple_background | white_background | boots | elbow_gloves | bell | christmas | fur_trim | reindeer_antlers | black_thighhighs | deer_ears | full_body | candy_cane | fake_animal_ears | gift_box | holding_sword | medium_breasts | smile | white_footwear | white_gloves | low-tied_long_hair | open_mouth | parted_lips | cleavage | navel | hat | blush | abs | collarbone | muscular_female | one-piece_swimsuit | bracelet |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------|:--------|:--------|:--------------------|:----------|:-------------|:--------------------|:-------------------|:--------|:---------------|:-------|:------------|:-----------|:-------------------|:-------------------|:------------|:------------|:-------------|:-------------------|:-----------|:----------------|:-----------------|:--------|:-----------------|:---------------|:---------------------|:-------------|:--------------|:-----------|:--------|:------|:--------|:------|:-------------|:------------------|:---------------------|:-----------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | X | X | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 2 | 17 |  |  |  |  |  | X | | X | X | | | | | | X | X | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X |
|
ibranze/araproje_mmlu_tr_conf_gpt2_nearestscore_true_x | ---
dataset_info:
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: validation
num_bytes: 137404.0
num_examples: 250
download_size: 83805
dataset_size: 137404.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_mmlu_tr_conf_gpt2_nearestscore_true_x"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
philschmid/hh-rrhf-dahoas-gptj-rm-25k | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: responses
sequence: string
- name: scores
sequence: float64
splits:
- name: train
num_bytes: 21973591
num_examples: 24983
download_size: 12522534
dataset_size: 21973591
---
# Dataset Card for "hh-rrhf-dahoas-gptj-rm-25k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigscience-data/roots_ar_wiktionary | ---
language: ar
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
|
jiwon65/aihub_general_6000_for_train | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: audio
sequence: float32
splits:
- name: train
num_bytes: 1212419491
num_examples: 6000
download_size: 1071487189
dataset_size: 1212419491
---
# Dataset Card for "korean-general-command-voice_0-6000_samplingRate-16000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibragim-bad/arc_challenge | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
struct:
- name: label
sequence: string
- name: text
sequence: string
- name: answerKey
dtype: string
splits:
- name: test
num_bytes: 375511
num_examples: 1172
- name: train
num_bytes: 349760
num_examples: 1119
- name: validation
num_bytes: 96660
num_examples: 299
download_size: 449682
dataset_size: 821931
---
# Dataset Card for "arc_challenge"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
imvladikon/hebrew_news | ---
annotations_creators:
- no-annotation
language_creators:
- other
language:
- he
license:
- other
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- summarization
task_ids:
- news-articles-summarization
---
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
```
id - article id
articleBody - article main content
description - short version of the article, description of the article
headline - headline of the article
title - title of the article
```
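The field list above maps directly onto a summarization pair. A minimal sketch with an invented record (the values below are placeholders, not real data):

```python
# A hypothetical record mirroring the fields listed above; values are invented.
article = {
    "id": "12345",
    "articleBody": "Full article text goes here, typically several paragraphs long.",
    "description": "Short version of the article.",
    "headline": "Example headline",
    "title": "Example title",
}

# For the news-articles-summarization task, pair the body with its description.
source, target = article["articleBody"], article["description"]
print(source)
print(target)
```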
|
open-llm-leaderboard/details_maywell__PiVoT-SUS-RP | ---
pretty_name: Evaluation run of maywell/PiVoT-SUS-RP
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [maywell/PiVoT-SUS-RP](https://huggingface.co/maywell/PiVoT-SUS-RP) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maywell__PiVoT-SUS-RP\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-15T19:33:37.820287](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__PiVoT-SUS-RP/blob/main/results_2024-01-15T19-33-37.820287.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7586143927988348,\n\
\ \"acc_stderr\": 0.028201585690631335,\n \"acc_norm\": 0.7620604659128375,\n\
\ \"acc_norm_stderr\": 0.028744091509590272,\n \"mc1\": 0.3843329253365973,\n\
\ \"mc1_stderr\": 0.017028707301245203,\n \"mc2\": 0.5456609919227617,\n\
\ \"mc2_stderr\": 0.014778672831926782\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.643344709897611,\n \"acc_stderr\": 0.013998056902620192,\n\
\ \"acc_norm\": 0.6655290102389079,\n \"acc_norm_stderr\": 0.013787460322441375\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6398127862975503,\n\
\ \"acc_stderr\": 0.00479073468370459,\n \"acc_norm\": 0.8422624975104561,\n\
\ \"acc_norm_stderr\": 0.0036374977089340356\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7111111111111111,\n\
\ \"acc_stderr\": 0.0391545063041425,\n \"acc_norm\": 0.7111111111111111,\n\
\ \"acc_norm_stderr\": 0.0391545063041425\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.9013157894736842,\n \"acc_stderr\": 0.024270227737522715,\n\
\ \"acc_norm\": 0.9013157894736842,\n \"acc_norm_stderr\": 0.024270227737522715\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8113207547169812,\n \"acc_stderr\": 0.024079995130062253,\n\
\ \"acc_norm\": 0.8113207547169812,\n \"acc_norm_stderr\": 0.024079995130062253\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n\
\ \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n\
\ \"acc_norm_stderr\": 0.023964965777906935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7341040462427746,\n\
\ \"acc_stderr\": 0.03368762932259431,\n \"acc_norm\": 0.7341040462427746,\n\
\ \"acc_norm_stderr\": 0.03368762932259431\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7787234042553192,\n \"acc_stderr\": 0.027136349602424063,\n\
\ \"acc_norm\": 0.7787234042553192,\n \"acc_norm_stderr\": 0.027136349602424063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5789473684210527,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.5789473684210527,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7655172413793103,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.7655172413793103,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6931216931216931,\n \"acc_stderr\": 0.023752928712112136,\n \"\
acc_norm\": 0.6931216931216931,\n \"acc_norm_stderr\": 0.023752928712112136\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5952380952380952,\n\
\ \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.5952380952380952,\n\
\ \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.8935483870967742,\n \"acc_stderr\": 0.01754510295165663,\n\
\ \"acc_norm\": 0.8935483870967742,\n \"acc_norm_stderr\": 0.01754510295165663\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6551724137931034,\n \"acc_stderr\": 0.03344283744280458,\n \"\
acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03344283744280458\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\"\
: 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706463,\n\
\ \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706463\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9242424242424242,\n \"acc_stderr\": 0.018852670234993093,\n \"\
acc_norm\": 0.9242424242424242,\n \"acc_norm_stderr\": 0.018852670234993093\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527033,\n\
\ \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527033\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8153846153846154,\n \"acc_stderr\": 0.019671632413100295,\n\
\ \"acc_norm\": 0.8153846153846154,\n \"acc_norm_stderr\": 0.019671632413100295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.43333333333333335,\n \"acc_stderr\": 0.030213340289237927,\n \
\ \"acc_norm\": 0.43333333333333335,\n \"acc_norm_stderr\": 0.030213340289237927\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8697478991596639,\n \"acc_stderr\": 0.021863258494852118,\n\
\ \"acc_norm\": 0.8697478991596639,\n \"acc_norm_stderr\": 0.021863258494852118\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.48344370860927155,\n \"acc_stderr\": 0.040802441856289715,\n \"\
acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.040802441856289715\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9211009174311927,\n \"acc_stderr\": 0.011558198113769578,\n \"\
acc_norm\": 0.9211009174311927,\n \"acc_norm_stderr\": 0.011558198113769578\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6527777777777778,\n \"acc_stderr\": 0.032468872436376486,\n \"\
acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.032468872436376486\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865384,\n \
\ \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865384\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n\
\ \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n\
\ \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.031545216720054725,\n\
\ \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.031545216720054725\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622793,\n \"\
acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622793\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.03038159675665168,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.03038159675665168\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8773006134969326,\n \"acc_stderr\": 0.025777328426978927,\n\
\ \"acc_norm\": 0.8773006134969326,\n \"acc_norm_stderr\": 0.025777328426978927\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.912621359223301,\n \"acc_stderr\": 0.027960689125970654,\n\
\ \"acc_norm\": 0.912621359223301,\n \"acc_norm_stderr\": 0.027960689125970654\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n\
\ \"acc_stderr\": 0.016534627684311364,\n \"acc_norm\": 0.9316239316239316,\n\
\ \"acc_norm_stderr\": 0.016534627684311364\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263714,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263714\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9054916985951469,\n\
\ \"acc_stderr\": 0.010461015338193068,\n \"acc_norm\": 0.9054916985951469,\n\
\ \"acc_norm_stderr\": 0.010461015338193068\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8121387283236994,\n \"acc_stderr\": 0.021029269752423228,\n\
\ \"acc_norm\": 0.8121387283236994,\n \"acc_norm_stderr\": 0.021029269752423228\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6212290502793296,\n\
\ \"acc_stderr\": 0.016223533510365123,\n \"acc_norm\": 0.6212290502793296,\n\
\ \"acc_norm_stderr\": 0.016223533510365123\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8562091503267973,\n \"acc_stderr\": 0.020091188936043693,\n\
\ \"acc_norm\": 0.8562091503267973,\n \"acc_norm_stderr\": 0.020091188936043693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8231511254019293,\n\
\ \"acc_stderr\": 0.0216700588855108,\n \"acc_norm\": 0.8231511254019293,\n\
\ \"acc_norm_stderr\": 0.0216700588855108\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.018689725721062072,\n\
\ \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.018689725721062072\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6347517730496454,\n \"acc_stderr\": 0.028723863853281267,\n \
\ \"acc_norm\": 0.6347517730496454,\n \"acc_norm_stderr\": 0.028723863853281267\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6075619295958279,\n\
\ \"acc_stderr\": 0.012471243669229096,\n \"acc_norm\": 0.6075619295958279,\n\
\ \"acc_norm_stderr\": 0.012471243669229096\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.02315746830855936,\n\
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02315746830855936\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8088235294117647,\n \"acc_stderr\": 0.015908290136278036,\n \
\ \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.015908290136278036\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8326530612244898,\n \"acc_stderr\": 0.02389714476891452,\n\
\ \"acc_norm\": 0.8326530612244898,\n \"acc_norm_stderr\": 0.02389714476891452\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n\
\ \"acc_stderr\": 0.022076326101824664,\n \"acc_norm\": 0.8905472636815921,\n\
\ \"acc_norm_stderr\": 0.022076326101824664\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n\
\ \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n\
\ \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.9005847953216374,\n \"acc_stderr\": 0.022949025579355027,\n\
\ \"acc_norm\": 0.9005847953216374,\n \"acc_norm_stderr\": 0.022949025579355027\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3843329253365973,\n\
\ \"mc1_stderr\": 0.017028707301245203,\n \"mc2\": 0.5456609919227617,\n\
\ \"mc2_stderr\": 0.014778672831926782\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.010470796496781105\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7050796057619408,\n \
\ \"acc_stderr\": 0.012560698010954762\n }\n}\n```"
repo_url: https://huggingface.co/maywell/PiVoT-SUS-RP
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|arc:challenge|25_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|gsm8k|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hellaswag|10_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T19-33-37.820287.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-15T19-33-37.820287.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- '**/details_harness|winogrande|5_2024-01-15T19-33-37.820287.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-15T19-33-37.820287.parquet'
- config_name: results
data_files:
- split: 2024_01_15T19_33_37.820287
path:
- results_2024-01-15T19-33-37.820287.parquet
- split: latest
path:
- results_2024-01-15T19-33-37.820287.parquet
---
# Dataset Card for Evaluation run of maywell/PiVoT-SUS-RP
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [maywell/PiVoT-SUS-RP](https://huggingface.co/maywell/PiVoT-SUS-RP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_maywell__PiVoT-SUS-RP",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-15T19:33:37.820287](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__PiVoT-SUS-RP/blob/main/results_2024-01-15T19-33-37.820287.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7586143927988348,
"acc_stderr": 0.028201585690631335,
"acc_norm": 0.7620604659128375,
"acc_norm_stderr": 0.028744091509590272,
"mc1": 0.3843329253365973,
"mc1_stderr": 0.017028707301245203,
"mc2": 0.5456609919227617,
"mc2_stderr": 0.014778672831926782
},
"harness|arc:challenge|25": {
"acc": 0.643344709897611,
"acc_stderr": 0.013998056902620192,
"acc_norm": 0.6655290102389079,
"acc_norm_stderr": 0.013787460322441375
},
"harness|hellaswag|10": {
"acc": 0.6398127862975503,
"acc_stderr": 0.00479073468370459,
"acc_norm": 0.8422624975104561,
"acc_norm_stderr": 0.0036374977089340356
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7111111111111111,
"acc_stderr": 0.0391545063041425,
"acc_norm": 0.7111111111111111,
"acc_norm_stderr": 0.0391545063041425
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9013157894736842,
"acc_stderr": 0.024270227737522715,
"acc_norm": 0.9013157894736842,
"acc_norm_stderr": 0.024270227737522715
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8113207547169812,
"acc_stderr": 0.024079995130062253,
"acc_norm": 0.8113207547169812,
"acc_norm_stderr": 0.024079995130062253
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9097222222222222,
"acc_stderr": 0.023964965777906935,
"acc_norm": 0.9097222222222222,
"acc_norm_stderr": 0.023964965777906935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.03368762932259431,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.03368762932259431
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7787234042553192,
"acc_stderr": 0.027136349602424063,
"acc_norm": 0.7787234042553192,
"acc_norm_stderr": 0.027136349602424063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7655172413793103,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.7655172413793103,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6931216931216931,
"acc_stderr": 0.023752928712112136,
"acc_norm": 0.6931216931216931,
"acc_norm_stderr": 0.023752928712112136
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5952380952380952,
"acc_stderr": 0.04390259265377563,
"acc_norm": 0.5952380952380952,
"acc_norm_stderr": 0.04390259265377563
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8935483870967742,
"acc_stderr": 0.01754510295165663,
"acc_norm": 0.8935483870967742,
"acc_norm_stderr": 0.01754510295165663
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706463,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706463
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9242424242424242,
"acc_stderr": 0.018852670234993093,
"acc_norm": 0.9242424242424242,
"acc_norm_stderr": 0.018852670234993093
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527033,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527033
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8153846153846154,
"acc_stderr": 0.019671632413100295,
"acc_norm": 0.8153846153846154,
"acc_norm_stderr": 0.019671632413100295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.43333333333333335,
"acc_stderr": 0.030213340289237927,
"acc_norm": 0.43333333333333335,
"acc_norm_stderr": 0.030213340289237927
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8697478991596639,
"acc_stderr": 0.021863258494852118,
"acc_norm": 0.8697478991596639,
"acc_norm_stderr": 0.021863258494852118
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.48344370860927155,
"acc_stderr": 0.040802441856289715,
"acc_norm": 0.48344370860927155,
"acc_norm_stderr": 0.040802441856289715
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9211009174311927,
"acc_stderr": 0.011558198113769578,
"acc_norm": 0.9211009174311927,
"acc_norm_stderr": 0.011558198113769578
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.018498315206865384,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.018498315206865384
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.031545216720054725,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.031545216720054725
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622793,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622793
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665168,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665168
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8773006134969326,
"acc_stderr": 0.025777328426978927,
"acc_norm": 0.8773006134969326,
"acc_norm_stderr": 0.025777328426978927
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5625,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.912621359223301,
"acc_stderr": 0.027960689125970654,
"acc_norm": 0.912621359223301,
"acc_norm_stderr": 0.027960689125970654
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9316239316239316,
"acc_stderr": 0.016534627684311364,
"acc_norm": 0.9316239316239316,
"acc_norm_stderr": 0.016534627684311364
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263714,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263714
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9054916985951469,
"acc_stderr": 0.010461015338193068,
"acc_norm": 0.9054916985951469,
"acc_norm_stderr": 0.010461015338193068
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8121387283236994,
"acc_stderr": 0.021029269752423228,
"acc_norm": 0.8121387283236994,
"acc_norm_stderr": 0.021029269752423228
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6212290502793296,
"acc_stderr": 0.016223533510365123,
"acc_norm": 0.6212290502793296,
"acc_norm_stderr": 0.016223533510365123
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8562091503267973,
"acc_stderr": 0.020091188936043693,
"acc_norm": 0.8562091503267973,
"acc_norm_stderr": 0.020091188936043693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8231511254019293,
"acc_stderr": 0.0216700588855108,
"acc_norm": 0.8231511254019293,
"acc_norm_stderr": 0.0216700588855108
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.018689725721062072,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.018689725721062072
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6347517730496454,
"acc_stderr": 0.028723863853281267,
"acc_norm": 0.6347517730496454,
"acc_norm_stderr": 0.028723863853281267
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6075619295958279,
"acc_stderr": 0.012471243669229096,
"acc_norm": 0.6075619295958279,
"acc_norm_stderr": 0.012471243669229096
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.02315746830855936,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.02315746830855936
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.015908290136278036,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.015908290136278036
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8326530612244898,
"acc_stderr": 0.02389714476891452,
"acc_norm": 0.8326530612244898,
"acc_norm_stderr": 0.02389714476891452
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824664,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824664
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9005847953216374,
"acc_stderr": 0.022949025579355027,
"acc_norm": 0.9005847953216374,
"acc_norm_stderr": 0.022949025579355027
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3843329253365973,
"mc1_stderr": 0.017028707301245203,
"mc2": 0.5456609919227617,
"mc2_stderr": 0.014778672831926782
},
"harness|winogrande|5": {
"acc": 0.8334648776637726,
"acc_stderr": 0.010470796496781105
},
"harness|gsm8k|5": {
"acc": 0.7050796057619408,
"acc_stderr": 0.012560698010954762
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mteb/twentynewsgroups-clustering | ---
language:
- en
--- |
X-LANCE/WebSRC_v1.0 | ---
license: cc-by-4.0
---
# WebSRC v1.0
WebSRC v1.0 is a dataset for reading comprehension on structural web pages.
The task is to answer questions about web pages, which requires a system to
have a comprehensive understanding of the spatial structure and logical
structure. WebSRC consists of 6.4K web pages and 400K question-answer pairs
about web pages. For each web page, we manually chose one segment from it
and saved the corresponding HTML code, screenshot, and metadata like
positions and sizes. Questions in WebSRC were created for each segment.
Answers are either text spans from web pages or yes/no. Taking the HTML
code, screenshot, metadata as well as question as input, a model is to
predict the answer from the web page. Our dataset is the first one that
provides HTML documents as well as images, and is larger in the number of
domains and queries.
For more details, please refer to our paper [WebSRC: A Dataset for Web-Based Structural Reading Comprehension](https://arxiv.org/abs/2101.09465).
The Leaderboard of WebSRC v1.0 can be found [here](https://x-lance.github.io/WebSRC/).
## Data Format Description
The dataset for each website will be stored in `dataset.csv` in the directory
`{domain-name}/{website-number}`. The corresponding raw data (including HTML
files, screenshots, bounding box coordinates, and page names and urls) is
stored in the `processed_data` folder in the same directory.
In `dataset.csv`, each row corresponds to one question-answer data point
except the header. The meanings of each column are as follows:
* `question`: a string, the question of this question-answer data point.
* `id`: a unique id for this question-answer data point. Each `id` has length 14: the first two characters are the domain indicator, and the following two digits are the website number. The corresponding page id can be extracted with `id[2:9]`; for example, id "sp160000100001" comes from the *sport* domain, website *16*, and the corresponding page is `1600001.html`.
* `element_id`: an integer, the tag id (corresponding to the tag's `tid` attribute in the HTML files) of the deepest tag in the DOM tree which contains the entire answer. For yes/no questions, there is no tag associated with the answer, so the `element_id` is -1.
* `answer_start`: an integer, the char offset of the answer from the start of the content of the tag specified by `element_id`. Note that before counting this offset, we first eliminate all inner tags in the specified tag and collapse consecutive whitespace into a single space. For yes/no questions, `answer_start` is 1 for answer "yes" and 0 for answer "no".
* `answer`: a string, the answer of this question-answer data point.
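As a minimal illustration of the id layout described above (a sketch, not part of the official tooling), the components can be recovered with simple slicing:

```python
def parse_qa_id(qa_id: str) -> dict:
    """Split a 14-character WebSRC QA id into its components."""
    assert len(qa_id) == 14
    return {
        "domain": qa_id[:2],    # two-letter domain indicator, e.g. "sp" for sport
        "website": qa_id[2:4],  # website number, e.g. "16"
        "page": qa_id[2:9],     # page id, e.g. "1600001" -> 1600001.html
    }

print(parse_qa_id("sp160000100001"))
# {'domain': 'sp', 'website': '16', 'page': '1600001'}
```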
## Data Statistics
We roughly divided the questions in WebSRC v1.0 into three categories: KV,
Compare, and Table. The detailed definitions can be found in our
[paper](https://arxiv.org/abs/2101.09465). The numbers of websites, webpages,
and QAs corresponding to the three categories are as follows:
Type | # Websites | # Webpages | # QAs
---- | ---------- | ---------- | -----
KV | 34 | 3,207 | 168,606
Comparison | 15 | 1,339 | 68,578
Table | 21 | 1,901 | 163,314
The statistics of the dataset splits are as follows:
Split | # Websites | # Webpages | # QAs
----- | ---------- | ---------- | -----
Train | 50 | 4,549 | 307,315
Dev | 10 | 913 | 52,826
Test | 10 | 985 | 40,357
## Obtain Test Result
For test set evaluation, please send your prediction files to
zhao_mengxin@sjtu.edu.cn and chenlusz@sjtu.edu.cn with title "WebSRC Test:
\<your model name\>+\<your institution\>". The submission should contain
two files:
```jsonc
// prediction.json
// A json format file, keys are ids and values are the predicted answers (string).
{
"sp160000100001": "predicted answer",
"sp160000100002": "...",
//...
}
// tag_prediction.json
// A json format file, keys are ids and values are the predicted tag tid (int)
{
"sp160000100001": -1,
"sp160000100002": -1,
//...
}
```
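A minimal sketch of producing these two files from in-memory predictions (the single entry below is a placeholder, not a real prediction):

```python
import json

# id -> predicted answer string, and id -> predicted tag tid (-1 for yes/no questions)
predictions = {"sp160000100001": "predicted answer"}
tag_predictions = {"sp160000100001": -1}

with open("prediction.json", "w") as f:
    json.dump(predictions, f, indent=2)
with open("tag_prediction.json", "w") as f:
    json.dump(tag_predictions, f, indent=2)
```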
We encourage you to submit results from **at least three runs with different random
seeds** to reduce the uncertainty of the experiments. Please place prediction files
for each run in different directories and submit a zipped file. The average test
result will be sent by email.
## Reference
If you use any source code or datasets included in this repository in your work,
please cite the corresponding papers. The BibTeX entries are listed below:
```text
@inproceedings{chen-etal-2021-websrc,
title = "{W}eb{SRC}: A Dataset for Web-Based Structural Reading Comprehension",
author = "Chen, Xingyu and
Zhao, Zihan and
Chen, Lu and
Ji, JiaBao and
Zhang, Danyang and
Luo, Ao and
Xiong, Yuxuan and
Yu, Kai",
booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
month = nov,
year = "2021",
address = "Online and Punta Cana, Dominican Republic",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.emnlp-main.343",
pages = "4173--4185",
abstract = "Web search is an essential way for humans to obtain information, but it{'}s still a great challenge for machines to understand the contents of web pages. In this paper, we introduce the task of web-based structural reading comprehension. Given a web page and a question about it, the task is to find an answer from the web page. This task requires a system not only to understand the semantics of texts but also the structure of the web page. Moreover, we proposed WebSRC, a novel Web-based Structural Reading Comprehension dataset. WebSRC consists of 400K question-answer pairs, which are collected from 6.4K web pages with corresponding HTML source code, screenshots, and metadata. Each question in WebSRC requires a certain structural understanding of a web page to answer, and the answer is either a text span on the web page or yes/no. We evaluate various strong baselines on our dataset to show the difficulty of our task. We also investigate the usefulness of structural information and visual features. Our dataset and baselines have been publicly available.",
}
```
|
open-llm-leaderboard/details_EleutherAI__gpt-neo-125m | ---
pretty_name: Evaluation run of EleutherAI/gpt-neo-125m
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [EleutherAI/gpt-neo-125m](https://huggingface.co/EleutherAI/gpt-neo-125m) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__gpt-neo-125m\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T09:42:25.890470](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__gpt-neo-125m/blob/main/results_2023-10-18T09-42-25.890470.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n\
\ \"em_stderr\": 0.0004191330178826801,\n \"f1\": 0.03690436241610747,\n\
\ \"f1_stderr\": 0.0011592977848577672,\n \"acc\": 0.2603955425321017,\n\
\ \"acc_stderr\": 0.007779096578699754\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.0004191330178826801,\n\
\ \"f1\": 0.03690436241610747,\n \"f1_stderr\": 0.0011592977848577672\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \
\ \"acc_stderr\": 0.0015145735612245494\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5177584846093133,\n \"acc_stderr\": 0.014043619596174959\n\
\ }\n}\n```"
repo_url: https://huggingface.co/EleutherAI/gpt-neo-125m
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T09_42_25.890470
path:
- '**/details_harness|drop|3_2023-10-18T09-42-25.890470.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T09-42-25.890470.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T09_42_25.890470
path:
- '**/details_harness|gsm8k|5_2023-10-18T09-42-25.890470.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T09-42-25.890470.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:58:00.274896.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:58:00.274896.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:58:00.274896.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T09_42_25.890470
path:
- '**/details_harness|winogrande|5_2023-10-18T09-42-25.890470.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T09-42-25.890470.parquet'
- config_name: results
data_files:
- split: 2023_07_19T13_58_00.274896
path:
- results_2023-07-19T13:58:00.274896.parquet
- split: 2023_10_18T09_42_25.890470
path:
- results_2023-10-18T09-42-25.890470.parquet
- split: latest
path:
- results_2023-10-18T09-42-25.890470.parquet
---
# Dataset Card for Evaluation run of EleutherAI/gpt-neo-125m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/gpt-neo-125m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/gpt-neo-125m](https://huggingface.co/EleutherAI/gpt-neo-125m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__gpt-neo-125m",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-18T09:42:25.890470](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__gpt-neo-125m/blob/main/results_2023-10-18T09-42-25.890470.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.0004191330178826801,
"f1": 0.03690436241610747,
"f1_stderr": 0.0011592977848577672,
"acc": 0.2603955425321017,
"acc_stderr": 0.007779096578699754
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.0004191330178826801,
"f1": 0.03690436241610747,
"f1_stderr": 0.0011592977848577672
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245494
},
"harness|winogrande|5": {
"acc": 0.5177584846093133,
"acc_stderr": 0.014043619596174959
}
}
```
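The results JSON above is a plain nested mapping whose top-level keys follow a `harness|<task>|<n_shot>` naming convention. A minimal sketch of reading a per-task metric from it, using only the values shown above:

```python
# A fragment of the results dict shown above; task keys follow the
# "harness|<task>|<n_shot>" naming convention.
results = {
    "harness|gsm8k|5": {"acc": 0.003032600454890068},
    "harness|winogrande|5": {"acc": 0.5177584846093133},
}

# Pull one task's accuracy by its harness key.
winogrande_acc = results["harness|winogrande|5"]["acc"]
```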
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
adamo1139/toxic-dpo-natural-v3 | ---
license: other
license_name: other
license_link: LICENSE
---
|
jason-lee08/TinyStories_Mars | ---
dataset_info:
features:
- name: prompts
dtype: string
- name: stories
dtype: string
splits:
- name: train
num_bytes: 5569066
num_examples: 3364
download_size: 2292233
dataset_size: 5569066
---
# Dataset Card for "TinyStories_Mars"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TinyPixel/air | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: response
dtype: string
- name: category
dtype: string
- name: question_id
dtype: float64
splits:
- name: train
num_bytes: 43852716
num_examples: 27729
download_size: 23510335
dataset_size: 43852716
---
# Dataset Card for "air"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
d0rj/OpenOrca-ru | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 11568757682
num_examples: 4233923
download_size: 5699482220
dataset_size: 11568757682
size_categories:
- 1M<n<10M
language_creators:
- translated
language:
- ru
multilinguality:
- monolingual
pretty_name: OpenOrca (ru)
source_datasets:
- Open-Orca/OpenOrca
license: mit
tags:
- ChatGPT
- instruct
- instruct-tune
task_categories:
- conversational
- text-classification
- token-classification
- table-question-answering
- question-answering
- zero-shot-classification
- summarization
- feature-extraction
- text-generation
- text2text-generation
paperswithcode_id: orca-progressive-learning-from-complex
---
# OpenOrca-ru
## Dataset Description
- **Paper:** https://arxiv.org/abs/2306.02707
This is a translated version of [Open-Orca/OpenOrca](https://huggingface.co/datasets/Open-Orca/OpenOrca) into Russian. |
distilled-one-sec-cv12-each-chunk-uniq/chunk_142 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1078700212.0
num_examples: 210191
download_size: 1104583624
dataset_size: 1078700212.0
---
# Dataset Card for "chunk_142"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
totally-not-an-llm/EverythingLM-data-V3 | ---
license: mit
---
# EverythingLM V3 Dataset
**EverythingLM V3** is a diverse instruct dataset consisting of roughly 1.1k sysprompt-user-assistant triads. These were generated using principles from both evol-instruct and Orca. The dataset encompasses a wide array of topics and interactions.
### Differences from V2
* Used the March GPT-4 model instead of the latest
* Dynamically adjusted temperature based on the task
* Much more diverse (8 new categories)
* Flesch hints
* 10% more data
* Better filtering
* Overall refined dataset generation pipeline
### Category distribution

\*These values represent the data as generated, but slight filtering has been applied, so values might be a bit different. |
hml707/test | ---
license: apache-2.0
---
|
kristmh/high_vs_randommin_100_issues_per_repo | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validate
path: data/validate-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: text_clean
dtype: string
- name: label
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: test
num_bytes: 13172272
num_examples: 11681
- name: train
num_bytes: 103353722
num_examples: 93441
- name: validate
num_bytes: 13025230
num_examples: 11680
download_size: 61083853
dataset_size: 129551224
---
# Dataset Card for "high_vs_randommin_100_issues_per_repo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
japanese-asr/whisper_transcriptions.reazonspeech.all_60 | ---
dataset_info:
config_name: all
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 29261046421.0
num_examples: 256966
download_size: 29028844193
dataset_size: 29261046421.0
configs:
- config_name: all
data_files:
- split: train
path: all/train-*
---
|
greathero/evenmorex13-newothersmallerthreeclass-newercontrailsvalidationdataset | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 15125878.8
num_examples: 1800
download_size: 3155745
dataset_size: 15125878.8
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joshuasundance/mtg-coloridentity-multilabel-classification | ---
dataset_info:
features:
- name: text
dtype: string
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 5011317.077050539
num_examples: 22208
- name: test
num_bytes: 1253054.9229494615
num_examples: 5553
download_size: 2405205
dataset_size: 6264372
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
license: mit
task_categories:
- text-classification
task_ids:
- multi-label-classification
language:
- en
tags:
- mtg
- multilabel
- magic
pretty_name: Magic the Gathering Color Identity Multilabel Classification
size_categories:
- 10K<n<100K
---
This dataset was made specifically for multilabel classification using the following process:
1. Downloading https://mtgjson.com/api/v5/AtomicCards.json.bz2 on January 10, 2024
2. Encoding color identity of each card into the `labels` feature
```python
colors = ['B', 'G', 'R', 'U', 'W']
b = [1, 0, 0, 0, 0]
bw = [1, 0, 0, 0, 1]
gru = [0, 1, 1, 1, 0]
# and so on
```
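The encoding above can be written as a small helper (a sketch; `encode_color_identity` is our name, not part of the dataset tooling):

```python
colors = ['B', 'G', 'R', 'U', 'W']


def encode_color_identity(identity):
    # Multi-hot vector over the fixed alphabetical color order above,
    # e.g. {'B', 'W'} -> [1, 0, 0, 0, 1].
    return [1 if c in identity else 0 for c in colors]
```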
3. Concatenating card name and card text into the `text` feature
4. `split = ds['train'].train_test_split(test_size=0.2)`
5. `split.push_to_hub("mtg-coloridentity-multilabel-classification")` |
TinyPixel/fish-1 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4680929994
num_examples: 2840090
download_size: 2704444515
dataset_size: 4680929994
---
# Dataset Card for "fish-1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jhaberbe/lipid-droplets-v2 | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 12799326.0
num_examples: 10
download_size: 12808424
dataset_size: 12799326.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MicPie/unpredictable_cluster24 | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: UnpredicTable-cluster24
size_categories:
- 100K<n<1M
source_datasets: []
task_categories:
- multiple-choice
- question-answering
- zero-shot-classification
- text2text-generation
- table-question-answering
- text-generation
- text-classification
- tabular-classification
task_ids:
- multiple-choice-qa
- extractive-qa
- open-domain-qa
- closed-domain-qa
- closed-book-qa
- open-book-qa
- language-modeling
- multi-class-classification
- natural-language-inference
- topic-classification
- multi-label-classification
- tabular-multi-class-classification
- tabular-multi-label-classification
---
# Dataset Card for "UnpredicTable-cluster24" - Dataset of Few-shot Tasks from Tables
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://ethanperez.net/unpredictable
- **Repository:** https://github.com/JunShern/few-shot-adaptation
- **Paper:** Few-shot Adaptation Works with UnpredicTable Data
- **Point of Contact:** junshern@nyu.edu, perez@nyu.edu
### Dataset Summary
The UnpredicTable dataset consists of web tables formatted as few-shot tasks for fine-tuning language models to improve their few-shot performance.
There are several dataset versions available:
* [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full): Starting from the initial WTC corpus of 50M tables, we apply our tables-to-tasks procedure to produce our resulting dataset, [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full), which comprises 413,299 tasks from 23,744 unique websites.
* [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique): This is the same as [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full) but filtered to have a maximum of one task per website. [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique) contains exactly 23,744 tasks from 23,744 websites.
* [UnpredicTable-5k](https://huggingface.co/datasets/MicPie/unpredictable_5k): This dataset contains 5k random tables from the full dataset.
* UnpredicTable data subsets based on a manual human quality rating (please see our publication for details of the ratings):
* [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low)
* [UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium)
* [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high)
* UnpredicTable data subsets based on the website of origin:
* [UnpredicTable-baseball-fantasysports-yahoo-com](https://huggingface.co/datasets/MicPie/unpredictable_baseball-fantasysports-yahoo-com)
* [UnpredicTable-bulbapedia-bulbagarden-net](https://huggingface.co/datasets/MicPie/unpredictable_bulbapedia-bulbagarden-net)
* [UnpredicTable-cappex-com](https://huggingface.co/datasets/MicPie/unpredictable_cappex-com)
* [UnpredicTable-cram-com](https://huggingface.co/datasets/MicPie/unpredictable_cram-com)
* [UnpredicTable-dividend-com](https://huggingface.co/datasets/MicPie/unpredictable_dividend-com)
* [UnpredicTable-dummies-com](https://huggingface.co/datasets/MicPie/unpredictable_dummies-com)
* [UnpredicTable-en-wikipedia-org](https://huggingface.co/datasets/MicPie/unpredictable_en-wikipedia-org)
* [UnpredicTable-ensembl-org](https://huggingface.co/datasets/MicPie/unpredictable_ensembl-org)
* [UnpredicTable-gamefaqs-com](https://huggingface.co/datasets/MicPie/unpredictable_gamefaqs-com)
* [UnpredicTable-mgoblog-com](https://huggingface.co/datasets/MicPie/unpredictable_mgoblog-com)
* [UnpredicTable-mmo-champion-com](https://huggingface.co/datasets/MicPie/unpredictable_mmo-champion-com)
* [UnpredicTable-msdn-microsoft-com](https://huggingface.co/datasets/MicPie/unpredictable_msdn-microsoft-com)
* [UnpredicTable-phonearena-com](https://huggingface.co/datasets/MicPie/unpredictable_phonearena-com)
* [UnpredicTable-sittercity-com](https://huggingface.co/datasets/MicPie/unpredictable_sittercity-com)
* [UnpredicTable-sporcle-com](https://huggingface.co/datasets/MicPie/unpredictable_sporcle-com)
* [UnpredicTable-studystack-com](https://huggingface.co/datasets/MicPie/unpredictable_studystack-com)
* [UnpredicTable-support-google-com](https://huggingface.co/datasets/MicPie/unpredictable_support-google-com)
* [UnpredicTable-w3-org](https://huggingface.co/datasets/MicPie/unpredictable_w3-org)
* [UnpredicTable-wiki-openmoko-org](https://huggingface.co/datasets/MicPie/unpredictable_wiki-openmoko-org)
* [UnpredicTable-wkdu-org](https://huggingface.co/datasets/MicPie/unpredictable_wkdu-org)
* UnpredicTable data subsets based on clustering (for the clustering details please see our publication):
* [UnpredicTable-cluster00](https://huggingface.co/datasets/MicPie/unpredictable_cluster00)
* [UnpredicTable-cluster01](https://huggingface.co/datasets/MicPie/unpredictable_cluster01)
* [UnpredicTable-cluster02](https://huggingface.co/datasets/MicPie/unpredictable_cluster02)
* [UnpredicTable-cluster03](https://huggingface.co/datasets/MicPie/unpredictable_cluster03)
* [UnpredicTable-cluster04](https://huggingface.co/datasets/MicPie/unpredictable_cluster04)
* [UnpredicTable-cluster05](https://huggingface.co/datasets/MicPie/unpredictable_cluster05)
* [UnpredicTable-cluster06](https://huggingface.co/datasets/MicPie/unpredictable_cluster06)
* [UnpredicTable-cluster07](https://huggingface.co/datasets/MicPie/unpredictable_cluster07)
* [UnpredicTable-cluster08](https://huggingface.co/datasets/MicPie/unpredictable_cluster08)
* [UnpredicTable-cluster09](https://huggingface.co/datasets/MicPie/unpredictable_cluster09)
* [UnpredicTable-cluster10](https://huggingface.co/datasets/MicPie/unpredictable_cluster10)
* [UnpredicTable-cluster11](https://huggingface.co/datasets/MicPie/unpredictable_cluster11)
* [UnpredicTable-cluster12](https://huggingface.co/datasets/MicPie/unpredictable_cluster12)
* [UnpredicTable-cluster13](https://huggingface.co/datasets/MicPie/unpredictable_cluster13)
* [UnpredicTable-cluster14](https://huggingface.co/datasets/MicPie/unpredictable_cluster14)
* [UnpredicTable-cluster15](https://huggingface.co/datasets/MicPie/unpredictable_cluster15)
* [UnpredicTable-cluster16](https://huggingface.co/datasets/MicPie/unpredictable_cluster16)
* [UnpredicTable-cluster17](https://huggingface.co/datasets/MicPie/unpredictable_cluster17)
* [UnpredicTable-cluster18](https://huggingface.co/datasets/MicPie/unpredictable_cluster18)
* [UnpredicTable-cluster19](https://huggingface.co/datasets/MicPie/unpredictable_cluster19)
* [UnpredicTable-cluster20](https://huggingface.co/datasets/MicPie/unpredictable_cluster20)
* [UnpredicTable-cluster21](https://huggingface.co/datasets/MicPie/unpredictable_cluster21)
* [UnpredicTable-cluster22](https://huggingface.co/datasets/MicPie/unpredictable_cluster22)
* [UnpredicTable-cluster23](https://huggingface.co/datasets/MicPie/unpredictable_cluster23)
* [UnpredicTable-cluster24](https://huggingface.co/datasets/MicPie/unpredictable_cluster24)
* [UnpredicTable-cluster25](https://huggingface.co/datasets/MicPie/unpredictable_cluster25)
* [UnpredicTable-cluster26](https://huggingface.co/datasets/MicPie/unpredictable_cluster26)
* [UnpredicTable-cluster27](https://huggingface.co/datasets/MicPie/unpredictable_cluster27)
* [UnpredicTable-cluster28](https://huggingface.co/datasets/MicPie/unpredictable_cluster28)
* [UnpredicTable-cluster29](https://huggingface.co/datasets/MicPie/unpredictable_cluster29)
* [UnpredicTable-cluster-noise](https://huggingface.co/datasets/MicPie/unpredictable_cluster-noise)
### Supported Tasks and Leaderboards
Since the tables come from the web, the distribution of tasks and topics is very broad. The shape of our dataset is very wide, i.e., we have 1000's of tasks, while each task has only a few examples, compared to most current NLP datasets which are very deep, i.e., 10s of tasks with many examples. This implies that our dataset covers a broad range of potential tasks, e.g., multiple-choice, question-answering, table-question-answering, text-classification, etc.
The intended use of this dataset is to improve few-shot performance by fine-tuning/pre-training on our dataset.
### Languages
English
## Dataset Structure
### Data Instances
Each task is represented as a jsonline file and consists of several few-shot examples. Each example is a dictionary containing a field 'task', which identifies the task, followed by an 'input', 'options', and 'output' field. The 'input' field contains several column elements of the same row in the table, while the 'output' field is a target which represents an individual column of the same row. Each task contains several such examples which can be concatenated as a few-shot task. In the case of multiple choice classification, the 'options' field contains the possible classes that a model needs to choose from.
There are also additional meta-data fields such as 'pageTitle', 'title', 'outputColName', 'url', 'wdcFile'.
### Data Fields
'task': task identifier
'input': column elements of a specific row in the table.
'options': for multiple choice classification, it provides the options to choose from.
'output': target column element of the same row as input.
'pageTitle': the title of the page containing the table.
'outputColName': output column name
'url': url to the website containing the table
'wdcFile': WDC Web Table Corpus file
### Data Splits
The UnpredicTable datasets do not come with additional data splits.
## Dataset Creation
### Curation Rationale
Few-shot training on multi-task datasets has been demonstrated to improve language models' few-shot learning (FSL) performance on new tasks, but it is unclear which training tasks lead to effective downstream task adaptation. Few-shot learning datasets are typically produced with expensive human curation, limiting the scale and diversity of the training tasks available to study. As an alternative source of few-shot data, we automatically extract 413,299 tasks from diverse internet tables. We provide this as a research resource to investigate the relationship between training data and few-shot learning.
### Source Data
#### Initial Data Collection and Normalization
We use internet tables from the English-language Relational Subset of the WDC Web Table Corpus 2015 (WTC). The WTC dataset tables were extracted from the July 2015 Common Crawl web corpus (http://webdatacommons.org/webtables/2015/EnglishStatistics.html). The dataset contains 50,820,165 tables from 323,160 web domains. We then convert the tables into few-shot learning tasks. Please see our publication for more details on the data collection and conversion pipeline.
#### Who are the source language producers?
The dataset is extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/).
### Annotations
#### Annotation process
Manual annotation was only carried out for the [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low),
[UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium), and [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high) data subsets to rate task quality. Detailed annotation instructions can be found in our publication.
#### Who are the annotators?
Annotations were carried out by a lab assistant.
### Personal and Sensitive Information
The data was extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/), which in turn extracted tables from the [Common Crawl](https://commoncrawl.org/). We did not filter the data in any way. Thus any user identities or otherwise sensitive information (e.g., data that reveals racial or ethnic origins, sexual orientations, religious beliefs, political opinions or union memberships, or locations; financial or health data; biometric or genetic data; forms of government identification, such as social security numbers; criminal history, etc.) might be contained in our dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended for use as a research resource to investigate the relationship between training data and few-shot learning. As such, it contains high- and low-quality data, as well as diverse content that may be untruthful or inappropriate. Without careful investigation, it should not be used for training models that will be deployed for use in decision-critical or user-facing situations.
### Discussion of Biases
Since our dataset contains tables that are scraped from the web, it will also contain many toxic, racist, sexist, and otherwise harmful biases and texts. We have not run any analysis on the biases prevalent in our datasets. Neither have we explicitly filtered the content. This implies that a model trained on our dataset may potentially reflect harmful biases and toxic text that exist in our dataset.
### Other Known Limitations
No additional known limitations.
## Additional Information
### Dataset Curators
Jun Shern Chan, Michael Pieler, Jonathan Jao, Jérémy Scheurer, Ethan Perez
### Licensing Information
Apache 2.0
### Citation Information
```
@misc{chan2022few,
author = {Chan, Jun Shern and Pieler, Michael and Jao, Jonathan and Scheurer, Jérémy and Perez, Ethan},
title = {Few-shot Adaptation Works with UnpredicTable Data},
publisher={arXiv},
year = {2022},
url = {https://arxiv.org/abs/2208.01009}
}
```
|
danielpark/mquad-v1 | ---
license: apache-2.0
task_categories:
- question-answering
- text-generation
language:
- en
- ko
tags:
- biology
pretty_name: Medical domain QA dataset for training a medical chatbot.
---
# MQuAD
The Medical Question and Answering Dataset (MQuAD) is a refined collection built from the sources listed below. You can download it through the Hugging Face Hub and load it with the `datasets` library as follows.
## Quick Guide
```python
from datasets import load_dataset
dataset = load_dataset("danielpark/MQuAD-v1")
```
Medical Q/A datasets gathered from the following websites.
- eHealth Forum
- iCliniq
- Question Doctors
- WebMD
Data was gathered on May 5, 2017.
MQuAD provides the embedded question and answer arrays in string format, so it is recommended to convert the string-formatted arrays into float format as follows. Storing them this way saves the resources and time that would otherwise be spent on embedding.
```python
from datasets import load_dataset
from utilfunction import col_convert
import pandas as pd
qa = load_dataset("danielpark/MQuAD-v1", "csv")
df_qa = pd.DataFrame(qa['train'])
df_qa = col_convert(df_qa, ['Q_FFNN_embeds', 'A_FFNN_embeds'])
```
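The `col_convert` helper comes from the author's `utilfunction` module; if that module is not available, a minimal stand-in might look like the sketch below (an assumption: the embeddings are stored as bracketed, comma-separated strings such as `"[0.1, 0.2]"`):

```python
import ast

import pandas as pd


def col_convert(df: pd.DataFrame, columns: list) -> pd.DataFrame:
    """Parse string-encoded arrays (e.g. "[0.1, 0.2]") into lists of floats."""
    for col in columns:
        df[col] = df[col].apply(
            lambda s: [float(x) for x in ast.literal_eval(s)]
        )
    return df
```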
|
cahya/instructions-test | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 16048
num_examples: 22
download_size: 15127
dataset_size: 16048
---
# Dataset Card for "instructions-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thefcraft/civitai-stable-diffusion-337k | ---
annotations_creators:
- no-annotation
language_creators:
- thefcraft
language:
- en
pretty_name: civitai-stable-diffusion-337k
size_categories:
- 1M<n<10M
source_datasets:
- civitai
---
### How to Use
```python
from datasets import load_dataset
dataset = load_dataset("thefcraft/civitai-stable-diffusion-337k")
print(dataset['train'][0])
```
### Download images
Download the zip files from the images directory:
https://huggingface.co/datasets/thefcraft/civitai-stable-diffusion-337k/tree/main/images
Each archive contains images named by their id.
```python
from zipfile import ZipFile

with ZipFile("filename.zip", 'r') as zObject:
    zObject.extractall()
```
### Dataset Summary
GitHub URL: https://github.com/thefcraft/civitai-stable-diffusion-337k
Dataset: civitai-stable-diffusion-337k. This dataset contains 337k Civitai image URLs with prompts and related metadata, all collected via the Civitai API.
Project: https://github.com/thefcraft/nsfw-prompt-detection-sd (a model trained on this dataset)
Data structure for `othertype/civitai.json`:
```python
{
'items':[
{'id': 100657,
'url': 'https://imagecache.civitai.com/xG1nkqKTMzGDvpLrqFT7WA/2338276a-87f7-4a1e-f92a-776a18ee4200/width=768/2338276a-87f7-4a1e-f92a-776a18ee4200.jpeg',
'hash': 'U5Exz_00.8D$t89Z%M0100~VD*RktQxaIU~p',
'width': 768,
'height': 1368,
'nsfw': True,
'createdAt': '2023-02-14T10:05:11.498Z',
'postId': 60841,
'stats': {'cryCount': 0,
'laughCount': 0,
'likeCount': 26,
'dislikeCount': 0,
'heartCount': 50,
'commentCount': 4},
'meta': {'ENSD': '31337',
'Size': '512x912',
'seed': 3994946333,
'Model': 'AbyssOrangeMix2_sfw',
'steps': 20,
'prompt': '<lora:hiqcg_body-epoch-000004:0.5>, <lora:hiqcg_face-epoch-000004:0.4>, hiqcgbody, hiqcgface, 1girl, full body, standing, \ndetailed skin texture, detailed cloth texture, beautiful detailed face,\nmasterpiece, best quality, ultra detailed, 8k, intricate details,',
'sampler': 'DPM++ 2M Karras',
'cfgScale': 7,
'Clip skip': '2',
'resources': [{'hash': '038ba203d8',
'name': 'AbyssOrangeMix2_sfw',
'type': 'model'}],
'Model hash': '038ba203d8',
'Hires upscale': '1.5',
'Hires upscaler': 'Latent',
'negativePrompt': 'EasyNegative, extra fingers,fewer fingers, multiple girls, multiple views,',
'Denoising strength': '0.6'},
'username': 'NeoClassicalRibbon'},
{..},
..],
'metadata':{'totalItems': 327145}
}
```
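Given that structure, the prompts can be pulled out with a short walk over `items`; a sketch (the function name `extract_prompts` is ours, and entries whose `meta` block has no prompt are skipped):

```python
def extract_prompts(civitai):
    """Collect id/prompt/nsfw rows from the top-level 'items' list,
    skipping entries whose 'meta' block has no prompt."""
    rows = []
    for item in civitai.get('items', []):
        meta = item.get('meta') or {}
        prompt = meta.get('prompt')
        if prompt:
            rows.append({'id': item['id'],
                         'prompt': prompt,
                         'nsfw': item.get('nsfw', False)})
    return rows
```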
|
anderloh/MotorizedTransportSplit | ---
dataset_info:
- config_name: 2Class
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': Helicopter
'1': Racecar
splits:
- name: train
num_bytes: 620401860.871
num_examples: 2769
- name: test
num_bytes: 351989787.33
num_examples: 1571
download_size: 972417488
dataset_size: 972391648.201
- config_name: 5Class
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': Helicopter
'1': Jet
'2': Racecar
'3': Trains
'4': Truck
splits:
- name: train
num_bytes: 1860497842.936
num_examples: 8304
- name: test
num_bytes: 761781853.0
num_examples: 3400
download_size: 2621469388
dataset_size: 2622279695.936
configs:
- config_name: 2Class
data_files:
- split: train
path: 2Class/train-*
- split: test
path: 2Class/test-*
- config_name: 5Class
data_files:
- split: train
path: 5Class/train-*
- split: test
path: 5Class/test-*
---
# Dataset Card for "MotorizedTransportSplit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/yu_mei_ren_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yu_mei_ren/虞美人/虞美人 (Fate/Grand Order)
This is the dataset of yu_mei_ren/虞美人/虞美人 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `long_hair, brown_hair, breasts, very_long_hair, earrings, medium_breasts, red_eyes, ear_piercing, multiple_earrings, glasses, brown_eyes, braid, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, etc.); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 825.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yu_mei_ren_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 702.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yu_mei_ren_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1256 | 1.31 GiB | [Download](https://huggingface.co/datasets/CyberHarem/yu_mei_ren_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/yu_mei_ren_fgo',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, black_dress, black_jacket, center_opening, choker, cleavage, collarbone, fur-trimmed_jacket, jewelry, open_jacket, piercing, revealing_clothes, solo, strapless_dress, looking_at_viewer, ribbon-trimmed_dress, simple_background, upper_body, bare_shoulders, hair_between_eyes, long_sleeves, open_mouth, white_background, closed_mouth, navel, sidelocks |
| 1 | 6 |  |  |  |  |  | 1girl, black_dress, center_opening, choker, cleavage, collarbone, jewelry, long_sleeves, looking_at_viewer, navel, revealing_clothes, solo, strapless_dress, black_jacket, fur-trimmed_jacket, open_jacket, piercing, bare_shoulders, closed_mouth, ribbon-trimmed_dress, smile, hair_between_eyes, red_background, simple_background |
| 2 | 14 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, center_opening, choker, revealing_clothes, ribbon-trimmed_dress, solo, strapless_dress, black_gloves, collarbone, elbow_gloves, jewelry, cleavage, looking_at_viewer, navel, piercing, blush, closed_mouth, simple_background, white_background |
| 3 | 10 |  |  |  |  |  | 1girl, black_dress, black_gloves, choker, elbow_gloves, holding_sword, jewelry, looking_at_viewer, piercing, revealing_clothes, solo, strapless_dress, bare_shoulders, center_opening, collarbone, closed_mouth, dual_wielding, navel, simple_background, cleavage, ribbon-trimmed_dress, white_background |
| 4 | 5 |  |  |  |  |  | 1girl, black_dress, jewelry, long_sleeves, piercing, simple_background, solo, white_background, black_gloves, choker, cleavage, collarbone, fur-trimmed_jacket, looking_at_viewer, blush, closed_mouth, black_jacket, center_opening, fur_collar, head_rest, upper_body |
| 5 | 6 |  |  |  |  |  | 1girl, black-framed_eyewear, black_dress, closed_mouth, elbow_gloves, jewelry, piercing, single_braid, solo, black_gloves, looking_at_viewer, choker |
| 6 | 9 |  |  |  |  |  | 1girl, black-framed_eyewear, black_dress, blue_dress, ribbon-trimmed_dress, black_gloves, blush, elbow_gloves, ribbed_dress, single_braid, solo, arm_strap, center_opening, open_mouth, braided_ponytail, jewelry, layered_dress, looking_at_viewer, choker, piercing, strapless_dress, pinstripe_pattern |
| 7 | 5 |  |  |  |  |  | 1girl, bare_shoulders, detached_collar, fake_animal_ears, jewelry, playboy_bunny, rabbit_ears, single_braid, solo, black-framed_eyewear, black_leotard, blush, bowtie, braided_ponytail, cleavage, highleg_leotard, looking_at_viewer, strapless_leotard, open_mouth, simple_background, thighs, white_background, wrist_cuffs, black_footwear, collarbone, covered_navel, fishnet_pantyhose, red_bow |
| 8 | 9 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, looking_at_viewer, navel, collarbone, jewelry, o-ring, solo, choker, single_braid, sling_bikini_top, blush, ocean, outdoors, smile, blue_sky, closed_mouth, day, piercing, beach, cloud, hair_scrunchie, hairclip, sitting, thighs, water |
| 9 | 23 |  |  |  |  |  | jewelry, 1girl, bare_shoulders, china_dress, braided_ponytail, looking_at_viewer, blue_dress, thighs, blush, brown_pantyhose, single_braid, solo, smile, looking_back, pelvic_curtain, tassel, panties, red_trim, black-framed_eyewear, sideboob |
| 10 | 5 |  |  |  |  |  | 1girl, anus, ass, blush, from_behind, jewelry, looking_at_viewer, looking_back, pussy, solo, thighs, arm_strap, braided_ponytail, piercing, single_braid, sweat, barefoot, completely_nude, feet, lying, mosaic_censoring, pillow |
| 11 | 9 |  |  |  |  |  | 1boy, 1girl, blush, hetero, navel, nipples, collarbone, open_mouth, penis, sweat, choker, jewelry, mosaic_censoring, sex, spread_legs, thighs, vaginal, nude, solo_focus, arm_strap, black_gloves, cowgirl_position, cum_in_pussy, elbow_gloves, girl_on_top, on_back |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_dress | black_jacket | center_opening | choker | cleavage | collarbone | fur-trimmed_jacket | jewelry | open_jacket | piercing | revealing_clothes | solo | strapless_dress | looking_at_viewer | ribbon-trimmed_dress | simple_background | upper_body | bare_shoulders | hair_between_eyes | long_sleeves | open_mouth | white_background | closed_mouth | navel | sidelocks | smile | red_background | black_gloves | elbow_gloves | blush | holding_sword | dual_wielding | fur_collar | head_rest | black-framed_eyewear | single_braid | blue_dress | ribbed_dress | arm_strap | braided_ponytail | layered_dress | pinstripe_pattern | detached_collar | fake_animal_ears | playboy_bunny | rabbit_ears | black_leotard | bowtie | highleg_leotard | strapless_leotard | thighs | wrist_cuffs | black_footwear | covered_navel | fishnet_pantyhose | red_bow | o-ring | sling_bikini_top | ocean | outdoors | blue_sky | day | beach | cloud | hair_scrunchie | hairclip | sitting | water | china_dress | brown_pantyhose | looking_back | pelvic_curtain | tassel | panties | red_trim | sideboob | anus | ass | from_behind | pussy | sweat | barefoot | completely_nude | feet | lying | mosaic_censoring | pillow | 1boy | hetero | nipples | penis | sex | spread_legs | vaginal | nude | solo_focus | cowgirl_position | cum_in_pussy | girl_on_top | on_back |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------------|:---------------|:-----------------|:---------|:-----------|:-------------|:---------------------|:----------|:--------------|:-----------|:--------------------|:-------|:------------------|:--------------------|:-----------------------|:--------------------|:-------------|:-----------------|:--------------------|:---------------|:-------------|:-------------------|:---------------|:--------|:------------|:--------|:-----------------|:---------------|:---------------|:--------|:----------------|:----------------|:-------------|:------------|:-----------------------|:---------------|:-------------|:---------------|:------------|:-------------------|:----------------|:--------------------|:------------------|:-------------------|:----------------|:--------------|:----------------|:---------|:------------------|:--------------------|:---------|:--------------|:-----------------|:----------------|:--------------------|:----------|:---------|:-------------------|:--------|:-----------|:-----------|:------|:--------|:--------|:-----------------|:-----------|:----------|:--------|:--------------|:------------------|:---------------|:-----------------|:---------|:----------|:-----------|:-----------|:-------|:------|:--------------|:--------|:--------|:-----------|:------------------|:-------|:--------|:-------------------|:---------|:-------|:---------|:----------|:--------|:------|:--------------|:----------|:-------|:-------------|:-------------------|:---------------|:--------------|:----------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | | | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | X | | X | X | X | X | | X | | X | X | X | X | X | X | X | | X | | | | X | X | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | X | | X | X | X | X | | X | | X | X | X | X | X | X | X | | X | | | | X | X | X | | | | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | | X | | X | | X | X | | | X | | X | X | | | | | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | | | X | | | | X | | X | | X | | X | | | | | | | | | X | | | | | X | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | X | | X | X | | | | X | | X | | X | X | X | X | | | | | | X | | | | | | | X | X | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | | | X | X | | X | | | | X | | X | | X | | X | | | X | X | | | | | | | | X | | | | | X | X | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 9 |  |  |  |  |  | X | | | | X | X | X | | X | | X | | X | | X | | | | X | | | | | X | X | | X | | | | X | | | | | | X | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 23 |  |  |  |  |  | X | | | | | | | | X | | | | X | | X | | | | X | | | | | | | | X | | | | X | | | | | X | X | X | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 5 |  |  |  |  |  | X | | | | | | | | X | | X | | X | | X | | | | | | | | | | | | | | | | X | | | | | | X | | | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 11 | 9 |  |  |  |  |  | X | | | | X | | X | | X | | | | | | | | | | | | | X | | | X | | | | X | X | X | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
mahdibaghbanzadeh/GUE_tf_4 | ---
dataset_info:
features:
- name: sequence
dtype: string
- name: labels
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 2147000
num_examples: 19000
- name: val
num_bytes: 113000
num_examples: 1000
- name: test
num_bytes: 113000
num_examples: 1000
download_size: 1070830
dataset_size: 2373000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
CVasNLPExperiments/DTD_parition1_test_google_flan_t5_xxl_mode_C_T_A_T_SPECIFIC_ns_1880 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_ensemble_specific_rices
num_bytes: 692068
num_examples: 1880
- name: fewshot_1_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_ensemble_specific_rices
num_bytes: 1329559
num_examples: 1880
- name: fewshot_1_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 1330297
num_examples: 1880
download_size: 997973
dataset_size: 3351924
---
# Dataset Card for "DTD_parition1_test_google_flan_t5_xxl_mode_C_T_A_T_SPECIFIC_ns_1880"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vitorbr2009/voz-indio-treinada | ---
license: openrail
---
|
cmeraki/hindi_eval_general_mcq | ---
language:
- hi
license: cc
size_categories:
- 1K<n<10K
task_categories:
- question-answering
dataset_info:
features:
- name: QUESTION
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: TARGET
dtype: string
- name: SUBJECT
dtype: string
- name: GRADE
dtype: string
- name: TOPIC
dtype: string
- name: SPLIT
dtype: string
splits:
- name: train
num_bytes: 881272.5942028986
num_examples: 1653
download_size: 380835
dataset_size: 881272.5942028986
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- maths
- physics
---
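Given the schema above (a `QUESTION`, options `A`–`D`, and a `TARGET` letter), scoring a model reduces to exact-match accuracy over predicted option letters; a pure-Python sketch with hypothetical rows, not drawn from the actual data:

```python
# Hypothetical rows in the card's schema (QUESTION, A-D, TARGET)
rows = [
    {"QUESTION": "2 + 2 = ?", "A": "3", "B": "4", "C": "5", "D": "6", "TARGET": "B"},
    {"QUESTION": "3 * 3 = ?", "A": "6", "B": "8", "C": "9", "D": "12", "TARGET": "C"},
]
predictions = ["B", "D"]  # hypothetical model outputs

correct = sum(p == row["TARGET"] for p, row in zip(predictions, rows))
accuracy = correct / len(rows)
print(accuracy)  # 0.5
```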
|
ShoukanLabs/OpenNiji-380001_415000 | ---
dataset_info:
features:
- name: image
dtype: image
- name: url
dtype: string
- name: prompt
dtype: string
- name: style
dtype: string
splits:
- name: train
num_bytes: 56237791750.702
num_examples: 34999
download_size: 0
dataset_size: 56237791750.702
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "OpenNiji-380001_415000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alpayariyak/something | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 62321431
num_examples: 56037
download_size: 30816818
dataset_size: 62321431
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "something"
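The `instruction`/`input`/`output` triple matches the common Alpaca-style layout; a schematic prompt formatter (the template wording is an assumption for illustration, not documented by this card):

```python
def format_example(instruction: str, input_text: str, output: str) -> str:
    # Alpaca-style template: the "### Input:" section is included only when non-empty
    if input_text:
        prompt = (f"### Instruction:\n{instruction}\n\n"
                  f"### Input:\n{input_text}\n\n### Response:\n")
    else:
        prompt = f"### Instruction:\n{instruction}\n\n### Response:\n"
    return prompt + output

print(format_example("Add the numbers.", "2 and 3", "5"))
```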
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yuvalkirstain/PickaPic-ft-pairwise | ---
dataset_info:
features:
- name: good_image
dtype: image
- name: bad_image
dtype: image
- name: text
dtype: string
- name: good_url
dtype: string
- name: bad_url
dtype: string
splits:
- name: train
num_bytes: 8329737112.221633
num_examples: 10313
- name: test
num_bytes: 347307956.7783673
num_examples: 430
download_size: 8675583639
dataset_size: 8677045069.0
---
# Dataset Card for "PickaPic-ft-pairwise"
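Each row pairs a preferred (`good_image`) and a rejected (`bad_image`) generation for the same `text`, the shape used for pairwise preference training; a schematic Bradley–Terry style loss on hypothetical reward scores (the training recipe itself is an assumption, not documented by this card):

```python
import math

def pairwise_loss(score_good: float, score_bad: float) -> float:
    # -log sigmoid(score_good - score_bad): small when the preferred image scores higher
    return -math.log(1.0 / (1.0 + math.exp(-(score_good - score_bad))))

print(pairwise_loss(2.0, 0.0) < pairwise_loss(0.0, 2.0))  # True
```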
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yardeny/processed_bert_context_len_128 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 12001893540.0
num_examples: 15387043
download_size: 4048532411
dataset_size: 12001893540.0
---
# Dataset Card for "processed_bert_context_len_128"
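Each row is a fixed-length-128 tokenized example (`input_ids` as int32, `token_type_ids` and `attention_mask` as int8). A schematic of the usual BERT padding convention behind those three parallel sequences, with illustrative token ids:

```python
CONTEXT_LEN = 128
token_ids = [101, 7592, 2088, 102]  # hypothetical [CLS] hello world [SEP]
pad_len = CONTEXT_LEN - len(token_ids)

input_ids = token_ids + [0] * pad_len                  # 0 = [PAD]
attention_mask = [1] * len(token_ids) + [0] * pad_len  # 1 = real token
token_type_ids = [0] * CONTEXT_LEN                     # single-segment input

print(len(input_ids), sum(attention_mask))  # 128 4
```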
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
manot/pothole-segmentation | ---
task_categories:
- object-detection
tags:
- roboflow
- roboflow2huggingface
---
<div align="center">
<img width="640" alt="manot/pothole-segmentation" src="https://huggingface.co/datasets/manot/pothole-segmentation/resolve/main/thumbnail.jpg">
</div>
### Dataset Labels
```
['potholes', 'object', 'pothole', 'potholes']
```
### Number of Images
```json
{'valid': 157, 'test': 80, 'train': 582}
```
### How to Use
- Install [datasets](https://pypi.org/project/datasets/):
```bash
pip install datasets
```
- Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("manot/pothole-segmentation", name="full")
example = ds['train'][0]
```
### Roboflow Dataset Page
[https://universe.roboflow.com/abdulmohsen-fahad-f7pdw/road-damage-xvt2d/dataset/3](https://universe.roboflow.com/abdulmohsen-fahad-f7pdw/road-damage-xvt2d/dataset/3?ref=roboflow2huggingface)
### Citation
```
@misc{ road-damage-xvt2d_dataset,
title = { road damage Dataset },
type = { Open Source Dataset },
author = { abdulmohsen fahad },
howpublished = { \\url{ https://universe.roboflow.com/abdulmohsen-fahad-f7pdw/road-damage-xvt2d } },
url = { https://universe.roboflow.com/abdulmohsen-fahad-f7pdw/road-damage-xvt2d },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2023 },
month = { jun },
note = { visited on 2023-06-13 },
}
```
### License
CC BY 4.0
### Dataset Summary
This dataset was exported via roboflow.com on June 13, 2023 at 8:47 AM GMT
Roboflow is an end-to-end computer vision platform that helps you
* collaborate with your team on computer vision projects
* collect & organize images
* understand and search unstructured image data
* annotate, and create datasets
* export, train, and deploy computer vision models
* use active learning to improve your dataset over time
For state-of-the-art computer vision training notebooks you can use with this dataset,
visit https://github.com/roboflow/notebooks
To find over 100k other datasets and pre-trained models, visit https://universe.roboflow.com
The dataset includes 819 images.
Potholes are annotated in COCO format.
The following pre-processing was applied to each image:
* Auto-orientation of pixel data (with EXIF-orientation stripping)
No image augmentation techniques were applied.
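"Auto-orientation" here means the pixels are rotated/flipped to match the EXIF orientation tag and the tag is then dropped; a minimal sketch of the same normalization with Pillow (a library chosen for illustration, not part of the Roboflow export):

```python
from PIL import Image, ImageOps

img = Image.new("RGB", (640, 480))      # stand-in for a downloaded photo
upright = ImageOps.exif_transpose(img)  # applies the EXIF orientation, then strips it
print(upright.size)
```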
|
psroy/mini-platypus-two | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245925
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|