---
pretty_name: Evaluation run of openchat/openchat_v3.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openchat/openchat_v3.2](https://huggingface.co/openchat/openchat_v3.2) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openchat__openchat_v3.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-19T16:18:30.810728](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_v3.2/blob/main/results_2023-10-19T16-18-30.810728.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n\
\ \"em_stderr\": 0.00037786091964610503,\n \"f1\": 0.06215813758389262,\n\
\ \"f1_stderr\": 0.001356812104243941,\n \"acc\": 0.4530006767701489,\n\
\ \"acc_stderr\": 0.010645807081826102\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964610503,\n\
\ \"f1\": 0.06215813758389262,\n \"f1_stderr\": 0.001356812104243941\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13646702047005307,\n \
\ \"acc_stderr\": 0.00945574199881554\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836664\n\
\ }\n}\n```"
repo_url: https://huggingface.co/openchat/openchat_v3.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|arc:challenge|25_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T09_17_54.525414
path:
- '**/details_harness|drop|3_2023-10-17T09-17-54.525414.parquet'
- split: 2023_10_19T16_18_30.810728
path:
- '**/details_harness|drop|3_2023-10-19T16-18-30.810728.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-19T16-18-30.810728.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T09_17_54.525414
path:
- '**/details_harness|gsm8k|5_2023-10-17T09-17-54.525414.parquet'
- split: 2023_10_19T16_18_30.810728
path:
- '**/details_harness|gsm8k|5_2023-10-19T16-18-30.810728.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-19T16-18-30.810728.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hellaswag|10_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-02T17:42:42.050000.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-02T17:42:42.050000.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-02T17:42:42.050000.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T09_17_54.525414
path:
- '**/details_harness|winogrande|5_2023-10-17T09-17-54.525414.parquet'
- split: 2023_10_19T16_18_30.810728
path:
- '**/details_harness|winogrande|5_2023-10-19T16-18-30.810728.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-19T16-18-30.810728.parquet'
- config_name: results
data_files:
- split: 2023_08_02T17_42_42.050000
path:
- results_2023-08-02T17:42:42.050000.parquet
- split: 2023_10_17T09_17_54.525414
path:
- results_2023-10-17T09-17-54.525414.parquet
- split: 2023_10_19T16_18_30.810728
path:
- results_2023-10-19T16-18-30.810728.parquet
- split: latest
path:
- results_2023-10-19T16-18-30.810728.parquet
---
# Dataset Card for Evaluation run of openchat/openchat_v3.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openchat/openchat_v3.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openchat/openchat_v3.2](https://huggingface.co/openchat/openchat_v3.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openchat__openchat_v3.2",
"harness_winogrande_5",
split="train")
```
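Split names are derived from run timestamps by normalizing the separators inside the date and time parts to underscores (e.g. run `2023-10-19T16:18:30.810728` becomes split `2023_10_19T16_18_30.810728`). A minimal sketch of that mapping, assuming the convention inferred from the split listing above:

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp to its split name, e.g.
    "2023-10-19T16:18:30.810728" -> "2023_10_19T16_18_30.810728"."""
    date, time = timestamp.split("T")
    # Dashes in the date and colons in the time become underscores;
    # the fractional-second dot is kept as-is.
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(run_timestamp_to_split("2023-10-19T16:18:30.810728"))
```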
## Latest results
These are the [latest results from run 2023-10-19T16:18:30.810728](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_v3.2/blob/main/results_2023-10-19T16-18-30.810728.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964610503,
"f1": 0.06215813758389262,
"f1_stderr": 0.001356812104243941,
"acc": 0.4530006767701489,
"acc_stderr": 0.010645807081826102
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964610503,
"f1": 0.06215813758389262,
"f1_stderr": 0.001356812104243941
},
"harness|gsm8k|5": {
"acc": 0.13646702047005307,
"acc_stderr": 0.00945574199881554
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836664
}
}
```
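The results file shown above is a plain nested JSON mapping, so individual metrics can be read directly once it is loaded; for example (using an abridged copy of the numbers above):

```python
import json

# Abridged copy of the results JSON shown above.
results = json.loads("""
{
  "all": {"acc": 0.4530006767701489, "acc_stderr": 0.010645807081826102},
  "harness|winogrande|5": {"acc": 0.7695343330702447, "acc_stderr": 0.011835872164836664}
}
""")

# Task names are keys; metric names index into each task's dict.
acc = results["all"]["acc"]
print(f"aggregated acc: {acc:.4f}")  # aggregated acc: 0.4530
```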
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
graphs-datasets/MNIST | ---
license: mit
task_categories:
- graph-ml
---
# Dataset Card for MNIST
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [External Use](#external-use)
- [PyGeometric](#pygeometric)
- [Dataset Structure](#dataset-structure)
- [Data Properties](#data-properties)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **[Homepage](https://github.com/graphdeeplearning/benchmarking-gnns)**
- **Paper:** (see citation)
### Dataset Summary
The `MNIST` dataset consists of 55,000 images in 10 classes, each represented as a graph. It is derived from the well-known MNIST computer vision dataset.
### Supported Tasks and Leaderboards
`MNIST` should be used for multiclass graph classification.
## External Use
### PyGeometric
To load in PyGeometric, do the following:
```python
from datasets import load_dataset
from torch_geometric.data import Data
from torch_geometric.loader import DataLoader
dataset_hf = load_dataset("graphs-datasets/MNIST")
# For the train split (replace "train" with the validation or test split as needed).
# Data(**graph) maps each row's fields onto a PyG Data object; convert the
# list fields to torch tensors before training.
dataset_pg_list = [Data(**graph) for graph in dataset_hf["train"]]
dataset_pg = DataLoader(dataset_pg_list)
```
## Dataset Structure
### Data Properties
| property | value |
|---|---|
| #graphs | 55,000 |
| average #nodes | 70.6 |
| average #edges | 564.5 |
### Data Fields
Each row of a given file is a graph, with:
- `node_feat` (list: #nodes x #node-features): nodes
- `edge_index` (list: 2 x #edges): pairs of nodes constituting edges
- `edge_attr` (list: #edges x #edge-features): for the aforementioned edges, contains their features
- `y` (list: 1 x #labels): contains the graph label(s) to predict
- `num_nodes` (int): number of nodes of the graph
- `pos` (list: #nodes x 2): 2D positional information of each node
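As a sanity check, the field shapes above are mutually consistent. A minimal toy row (hypothetical values, not taken from the dataset; `pos` is shown as one 2-D coordinate per node) illustrates the invariants:

```python
# A hypothetical 3-node, 2-edge graph in the same layout as a dataset row.
row = {
    "node_feat": [[0.1], [0.2], [0.3]],            # #nodes x #node-features
    "edge_index": [[0, 1], [1, 2]],                # 2 x #edges (sources, targets)
    "edge_attr": [[1.0], [1.0]],                   # #edges x #edge-features
    "y": [7],                                      # graph label(s)
    "num_nodes": 3,
    "pos": [[0.0, 0.0], [0.5, 0.5], [1.0, 1.0]],   # one (x, y) position per node
}

# Shape invariants implied by the field descriptions above.
assert len(row["node_feat"]) == row["num_nodes"]
assert len(row["edge_index"]) == 2
assert len(row["edge_index"][0]) == len(row["edge_attr"])
assert len(row["pos"]) == row["num_nodes"]
print("row layout is consistent")
```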
### Data Splits
The data comes pre-split, following the PyGeometric version of the dataset.
## Additional Information
### Licensing Information
The dataset has been released under MIT license.
### Citation Information
```
@article{DBLP:journals/corr/abs-2003-00982,
author = {Vijay Prakash Dwivedi and
Chaitanya K. Joshi and
Thomas Laurent and
Yoshua Bengio and
Xavier Bresson},
title = {Benchmarking Graph Neural Networks},
journal = {CoRR},
volume = {abs/2003.00982},
year = {2020},
url = {https://arxiv.org/abs/2003.00982},
eprinttype = {arXiv},
eprint = {2003.00982},
timestamp = {Sat, 23 Jan 2021 01:14:30 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-2003-00982.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` |
CyberHarem/miyu_edelfelt_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of miyu_edelfelt/美遊・エーデルフェルト/美游·艾德费尔特 (Fate/Grand Order)
This is the dataset of miyu_edelfelt/美遊・エーデルフェルト/美游·艾德费尔特 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `black_hair, hair_ornament, brown_eyes, hairclip, long_hair, breasts, small_breasts, twintails`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([HuggingFace organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 668.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miyu_edelfelt_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 593.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miyu_edelfelt_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1255 | 1.18 GiB | [Download](https://huggingface.co/datasets/CyberHarem/miyu_edelfelt_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/miyu_edelfelt_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, blush, collarbone, navel, nipples, nude, looking_at_viewer, sidelocks, simple_background, solo, white_background, feather_hair_ornament, groin, thighs, x_hair_ornament, closed_mouth, open_mouth, out-of-frame_censoring |
| 1 | 10 |  |  |  |  |  | blush, nipples, open_mouth, sidelocks, thighs, 1boy, hetero, nude, tongue_out, 1girl, ass, collarbone, petite, sex_from_behind, sweat, black_thighhighs, feather_hair_ornament, looking_at_viewer, navel, purple_bikini, solo_focus, heart-shaped_pupils, micro_bikini |
| 2 | 8 |  |  |  |  |  | 1girl, blush, hetero, navel, nipples, open_mouth, sex, vaginal, 1boy, cum_in_pussy, penis, collarbone, loli, spread_legs, girl_on_top, solo_focus, bar_censor, completely_nude, cowgirl_position, glowing_tattoo, pubic_tattoo, thighs, heart-shaped_pupils, looking_at_viewer |
| 3 | 11 |  |  |  |  |  | 1girl, blush, nude, solo, looking_at_viewer, nipples, loli, ass, censored, open_mouth, bondage, looking_back, restrained, anus, from_behind, pussy_juice, thighs, cuffs, indoors, object_insertion, sex_toy |
| 4 | 11 |  |  |  |  |  | 1girl, bare_shoulders, blush, detached_sleeves, looking_at_viewer, magical_girl, purple_leotard, sidelocks, solo, purple_sleeves, x_hair_ornament, purple_thighhighs, thighs, white_cape, feather_hair_ornament, holding, long_sleeves, white_background, boots, covered_navel, simple_background, thigh_strap, wand, closed_mouth, hair_between_eyes |
| 5 | 9 |  |  |  |  |  | 1girl, detached_sleeves, looking_at_viewer, magical_girl, purple_leotard, purple_thighhighs, solo, boots, white_footwear, x_hair_ornament, blush, full_body, wand, white_cape, closed_mouth, long_sleeves, purple_sleeves, holding, ribbon, smile, simple_background, standing |
| 6 | 5 |  |  |  |  |  | 1girl, boots, detached_sleeves, looking_at_viewer, magical_girl, purple_leotard, purple_sleeves, purple_thighhighs, solo, white_footwear, x_hair_ornament, blush, butterfly, smile, ass, bare_shoulders, closed_mouth, long_sleeves, knees_up, sidelocks, sitting, thigh_strap, white_cape |
| 7 | 52 |  |  |  |  |  | cat_ears, looking_at_viewer, bare_shoulders, blush, paw_gloves, 1girl, jingle_bell, animal_ear_fluff, cat_tail, solo, blue_ribbon, black_panties, navel, grey_gloves, thighs, grey_thighhighs, feather_hair_ornament, fake_animal_ears, simple_background, white_background, grey_vest, open_mouth, ass, garter_straps |
| 8 | 17 |  |  |  |  |  | bare_shoulders, looking_at_viewer, blush, navel, day, outdoors, blue_sky, cloud, ocean, beach, collarbone, 1girl, 2girls, ponytail, smile, water, sidelocks, solo_focus, yellow_eyes, flower, frilled_bikini, hair_between_eyes, open_mouth |
| 9 | 9 |  |  |  |  |  | black_thighhighs, homurahara_academy_school_uniform, black_skirt, blush, looking_at_viewer, pleated_skirt, puffy_short_sleeves, white_shirt, 1girl, ass, beret, closed_mouth, neck_ribbon, red_ribbon, white_panties, bag, solo_focus, thighs |
| 10 | 7 |  |  |  |  |  | black_skirt, homurahara_academy_school_uniform, looking_at_viewer, pleated_skirt, 1girl, blush, closed_mouth, solo, white_sailor_collar, white_shirt, neck_ribbon, puffy_short_sleeves, red_ribbon, simple_background, white_background, smile, brown_footwear, full_body, hair_between_eyes, lifted_by_self, shoes, skirt_lift, standing, thighhighs |
| 11 | 11 |  |  |  |  |  | smile, blush, long_sleeves, floral_print, print_kimono, 1girl, closed_mouth, hair_between_eyes, looking_at_viewer, sidelocks, wide_sleeves, obi, blue_kimono, solo_focus, 2girls, holding, open_mouth, outdoors |
| 12 | 7 |  |  |  |  |  | bare_shoulders, fake_animal_ears, looking_at_viewer, playboy_bunny, rabbit_ears, strapless_leotard, wrist_cuffs, 1girl, blush, pantyhose, detached_collar, ponytail, rabbit_tail, highleg_leotard, purple_leotard, solo_focus, white_background, yellow_eyes |
| 13 | 5 |  |  |  |  |  | looking_at_viewer, white_apron, 1girl, black_dress, blush, enmaided, maid_apron, maid_headdress, puffy_short_sleeves, solo, blue_bow, hair_between_eyes, hair_bow, sidelocks, sitting, white_background, zettai_ryouiki, black_thighhighs, butterfly_hair_ornament, cake_slice, dress_lift, frilled_apron, frilled_dress, high_heels, holding_plate, lifted_by_self, navel, open_mouth, simple_background, strawberry |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | collarbone | navel | nipples | nude | looking_at_viewer | sidelocks | simple_background | solo | white_background | feather_hair_ornament | groin | thighs | x_hair_ornament | closed_mouth | open_mouth | out-of-frame_censoring | 1boy | hetero | tongue_out | ass | petite | sex_from_behind | sweat | black_thighhighs | purple_bikini | solo_focus | heart-shaped_pupils | micro_bikini | sex | vaginal | cum_in_pussy | penis | loli | spread_legs | girl_on_top | bar_censor | completely_nude | cowgirl_position | glowing_tattoo | pubic_tattoo | censored | bondage | looking_back | restrained | anus | from_behind | pussy_juice | cuffs | indoors | object_insertion | sex_toy | bare_shoulders | detached_sleeves | magical_girl | purple_leotard | purple_sleeves | purple_thighhighs | white_cape | holding | long_sleeves | boots | covered_navel | thigh_strap | wand | hair_between_eyes | white_footwear | full_body | ribbon | smile | standing | butterfly | knees_up | sitting | cat_ears | paw_gloves | jingle_bell | animal_ear_fluff | cat_tail | blue_ribbon | black_panties | grey_gloves | grey_thighhighs | fake_animal_ears | grey_vest | garter_straps | day | outdoors | blue_sky | cloud | ocean | beach | 2girls | ponytail | water | yellow_eyes | flower | frilled_bikini | homurahara_academy_school_uniform | black_skirt | pleated_skirt | puffy_short_sleeves | white_shirt | beret | neck_ribbon | red_ribbon | white_panties | bag | white_sailor_collar | brown_footwear | lifted_by_self | shoes | skirt_lift | thighhighs | floral_print | print_kimono | wide_sleeves | obi | blue_kimono | playboy_bunny | rabbit_ears | strapless_leotard | wrist_cuffs | pantyhose | detached_collar | rabbit_tail | highleg_leotard | white_apron | black_dress | enmaided | maid_apron | maid_headdress | blue_bow | hair_bow | zettai_ryouiki | butterfly_hair_ornament | cake_slice | dress_lift | frilled_apron | frilled_dress | high_heels | holding_plate | strawberry |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:-------------|:--------|:----------|:-------|:--------------------|:------------|:--------------------|:-------|:-------------------|:------------------------|:--------|:---------|:------------------|:---------------|:-------------|:-------------------------|:-------|:---------|:-------------|:------|:---------|:------------------|:--------|:-------------------|:----------------|:-------------|:----------------------|:---------------|:------|:----------|:---------------|:--------|:-------|:--------------|:--------------|:-------------|:------------------|:-------------------|:-----------------|:---------------|:-----------|:----------|:---------------|:-------------|:-------|:--------------|:--------------|:--------|:----------|:-------------------|:----------|:-----------------|:-------------------|:---------------|:-----------------|:-----------------|:--------------------|:-------------|:----------|:---------------|:--------|:----------------|:--------------|:-------|:--------------------|:-----------------|:------------|:---------|:--------|:-----------|:------------|:-----------|:----------|:-----------|:-------------|:--------------|:-------------------|:-----------|:--------------|:----------------|:--------------|:------------------|:-------------------|:------------|:----------------|:------|:-----------|:-----------|:--------|:--------|:--------|:---------|:-----------|:--------|:--------------|:---------|:-----------------|:------------------------------------|:--------------|:----------------|:----------------------|:--------------|:--------|:--------------|:-------------|:----------------|:------|:----------------------|:-----------------|:-----------------|:--------|:-------------|:-------------|:---------------|:---------------|:---------------|:------|:--------------|:----------------|:--------------|:--------------------|:--------------|:------------|:------------------|:--------------|:------------------|:--------------|:--------------|:-----------|:-------------|:-----------------|:-----------|:-----------|:-----------------|:--------------------------|:-------------|:-------------|:----------------|:----------------|:-------------|:----------------|:-------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | X | | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | X | X | | X | | | | | | | X | | | X | | X | X | | | | | | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | X | | | X | X | X | | | X | | | | X | | | X | | | | | X | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 11 |  |  |  |  |  | X | X | | | | | X | X | X | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | X | | | | | X | | X | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | | | | | X | X | | X | | | | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | X | X | | X | | | X | | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 52 |  |  |  |  |  | X | X | | X | | | X | | X | X | X | X | | X | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 17 |  |  |  |  |  | X | X | X | X | | | X | X | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 9 |  |  |  |  |  | X | X | | | | | X | | | | | | | X | | X | | | | | | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 7 |  |  |  |  |  | X | X | | | | | X | | X | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | X | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 11 |  |  |  |  |  | X | X | | | | | X | X | | | | | | | | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | X | | | | X | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 12 | 7 |  |  |  |  |  | X | X | | | | | X | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 13 | 5 |  |  |  |  |  | X | X | | X | | | X | X | X | X | X | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
bigscience-data/roots_en_no_code_stackexchange | ---
language: en
license: cc-by-sa-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_en_no_code_stackexchange
# Stack Exchange Website
- Dataset uid: `no_code_stackexchange`
### Description
Launched in 2010, the Stack Exchange network comprises 173 Q&A communities including Stack Overflow, the largest, most trusted online community for developers to learn, share their knowledge, and build their careers.
### Homepage
https://stackexchange.com/
### Licensing
- open license
- cc-by-sa-4.0: Creative Commons Attribution Share Alike 4.0 International
Subscriber Content
You agree that any and all content, including without limitation any and all text, graphics, logos, tools, photographs, images, illustrations, software or source code, audio and video, animations, and product feedback (collectively, “Content”) that you provide to the public Network (collectively, “Subscriber Content”), is perpetually and irrevocably licensed to Stack Overflow on a worldwide, royalty-free, non-exclusive basis pursuant to Creative Commons licensing terms (CC BY-SA 4.0), and you grant Stack Overflow the perpetual and irrevocable right and license to access, use, process, copy, distribute, export, display and to commercially exploit such Subscriber Content, even if such Subscriber Content has been contributed and subsequently removed by you as reasonably necessary to, for example (without limitation):
Provide, maintain, and update the public Network
Process lawful requests from law enforcement agencies and government agencies
Prevent and address security incidents and data security features, support features, and to provide technical assistance as it may be required
Aggregate data to provide product optimization
This means that you cannot revoke permission for Stack Overflow to publish, distribute, store and use such content and to allow others to have derivative rights to publish, distribute, store and use such content. The CC BY-SA 4.0 license terms are explained in further detail by Creative Commons, and the license terms applicable to content are explained in further detail here. You should be aware that all Public Content you contribute is available for public copy and redistribution, and all such Public Content must have appropriate attribution.
As stated above, by agreeing to these Public Network Terms you also agree to be bound by the terms and conditions of the Acceptable Use Policy incorporated herein, and hereby acknowledge and agree that any and all Public Content you provide to the public Network is governed by the Acceptable Use Policy.
### Speaker Locations
- Northern America
### Sizes
- 0.5414 % of total
- 2.9334 % of en
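The two percentages above also imply the approximate share of English in the whole corpus: if this subset is 0.5414 % of ROOTS and 2.9334 % of the English portion, English is roughly 0.5414 / 2.9334 ≈ 18.5 % of the total. In code:

```python
# Shares reported in the "Sizes" section above, in percent.
subset_share_of_total = 0.5414   # subset as % of the whole corpus
subset_share_of_en = 2.9334      # subset as % of the English portion

# subset = total * 0.005414 and subset = en * 0.029334, so:
en_share_of_total = subset_share_of_total / subset_share_of_en * 100
print(f"en is ~{en_share_of_total:.1f}% of ROOTS")  # en is ~18.5% of ROOTS
```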
### BigScience processing steps
#### Filters applied to: en
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
|
goodfellowliu/Set14 | ---
license: openrail
---
|
Vanzill/lol | ---
license: cc
---
|
mask-distilled-one-sec-cv12/chunk_166 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1164112672
num_examples: 228616
download_size: 1189684678
dataset_size: 1164112672
---
# Dataset Card for "chunk_166"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DylanJHJ/cqg4is | ---
license: apache-2.0
---
|
davidfant/natural-questions-chunk-8 | ---
dataset_info:
features:
- name: id
dtype: string
- name: document
struct:
- name: html
dtype: string
- name: title
dtype: string
- name: tokens
sequence:
- name: end_byte
dtype: int64
- name: is_html
dtype: bool
- name: start_byte
dtype: int64
- name: token
dtype: string
- name: url
dtype: string
- name: question
struct:
- name: text
dtype: string
- name: tokens
sequence: string
- name: long_answer_candidates
sequence:
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: top_level
dtype: bool
- name: annotations
sequence:
- name: id
dtype: string
- name: long_answer
struct:
- name: candidate_index
dtype: int64
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: short_answers
sequence:
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: text
dtype: string
- name: yes_no_answer
dtype:
class_label:
names:
'0': 'NO'
'1': 'YES'
splits:
- name: train
num_bytes: 4690331518
num_examples: 10000
download_size: 1821291244
dataset_size: 4690331518
---
# Dataset Card for "natural-questions-chunk-8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
stodoran/elwha-segmentation-tiny | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 199639536.0
num_examples: 198
- name: validation
num_bytes: 22848973.0
num_examples: 22
download_size: 222456047
dataset_size: 222488509.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
patruff/chucklesG1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 370644
num_examples: 1840
- name: test
num_bytes: 92483
num_examples: 460
download_size: 84476
dataset_size: 463127
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/akebono_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of akebono/曙/曙 (Kantai Collection)
This is the dataset of akebono/曙/曙 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `purple_hair, long_hair, side_ponytail, hair_ornament, purple_eyes, hair_flower, very_long_hair, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([HuggingFace organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 555.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akebono_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 337.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akebono_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1203 | 726.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akebono_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 500.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akebono_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1203 | 996.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akebono_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/akebono_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, flower, hair_bell, jingle_bell, serafuku, solo, upper_body, looking_at_viewer, simple_background, white_background, blue_sailor_collar, blush, short_sleeves, twitter_username |
| 1 | 6 |  |  |  |  |  | 1girl, flower, hair_bell, jingle_bell, open_mouth, serafuku, solo, machinery, turret, pleated_skirt, short_sleeves, cannon, looking_at_viewer |
| 2 | 8 |  |  |  |  |  | 1girl, blush, flower, hair_bell, jingle_bell, serafuku, solo, valentine, gift_box, heart-shaped_box, looking_at_viewer, sweater, pleated_skirt, apron, chocolate, long_sleeves, open_mouth, sitting, socks |
| 3 | 10 |  |  |  |  |  | 1girl, flower, hair_bell, jingle_bell, solo, looking_at_viewer, navel, pink_bikini, blush, collarbone, flat_chest, simple_background, bikini_skirt, white_background, cowboy_shot, scrunchie |
| 4 | 10 |  |  |  |  |  | 1girl, flower, hair_bell, jingle_bell, solo, looking_at_viewer, apron, blush, tasuki, short_kimono, floral_print, obi, wa_maid, broom, open_mouth, simple_background, white_background |
| 5 | 5 |  |  |  |  |  | 1girl, flower, hair_bell, jingle_bell, nude, open_mouth, small_breasts, solo, blush, nipples, pussy, looking_at_viewer, navel, ass, censored, lying, pillow, simple_background, socks |
| 6 | 10 |  |  |  |  |  | blush, flower, hair_bell, jingle_bell, looking_at_viewer, navel, underwear_only, 1girl, collarbone, pink_bra, pink_panties, solo, bow_panties, armpits, bare_shoulders, groin, on_back, small_breasts, arms_up, barefoot, bed_sheet, bow_bra, dakimakura_(medium) |
| 7 | 5 |  |  |  |  |  | 1girl, blush, christmas, full_body, simple_background, solo, white_background, black_thighhighs, flower, holding, ahoge, dress, long_sleeves, open_mouth, sack, santa_costume, torn_thighhighs, torpedo, capelet, closed_mouth, jingle_bell, one_eye_closed, stuffed_toy |
| 8 | 5 |  |  |  |  |  | 1girl, blush, cat_cutout, cat_ear_panties, cat_lingerie, cleavage_cutout, flower, hair_bell, jingle_bell, navel, solo, underwear_only, choker, collarbone, side-tie_panties, cat_ears, cowboy_shot, flat_chest, looking_at_viewer, black_bra, black_panties, frilled_bra, neck_bell, open_mouth, simple_background, small_breasts, thighhighs, white_background, white_bra, white_panties |
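For instance, a hedged sketch of mining one outfit from these clusters: select images whose tags cover a cluster's tag set. This assumes each item's `tags` metadata maps tag names to confidence scores (as with the waifuc loader above); verify against your extracted data before relying on it.

```python
# Hypothetical sketch: pick out images that match cluster 0's outfit tags.
# Assumes `tags` maps tag names to confidence scores; check your data first.
CLUSTER_0_TAGS = {"serafuku", "blue_sailor_collar", "hair_bell"}

def matches_cluster(tags, cluster_tags, threshold=0.5):
    # Keep only tags predicted above the confidence threshold,
    # then require the whole cluster tag set to be present.
    present = {t for t, score in tags.items() if score >= threshold}
    return cluster_tags.issubset(present)

sample_tags = {"serafuku": 0.98, "blue_sailor_collar": 0.91,
               "hair_bell": 0.88, "1girl": 0.99}
print(matches_cluster(sample_tags, CLUSTER_0_TAGS))  # → True
```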
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | flower | hair_bell | jingle_bell | serafuku | solo | upper_body | looking_at_viewer | simple_background | white_background | blue_sailor_collar | blush | short_sleeves | twitter_username | open_mouth | machinery | turret | pleated_skirt | cannon | valentine | gift_box | heart-shaped_box | sweater | apron | chocolate | long_sleeves | sitting | socks | navel | pink_bikini | collarbone | flat_chest | bikini_skirt | cowboy_shot | scrunchie | tasuki | short_kimono | floral_print | obi | wa_maid | broom | nude | small_breasts | nipples | pussy | ass | censored | lying | pillow | underwear_only | pink_bra | pink_panties | bow_panties | armpits | bare_shoulders | groin | on_back | arms_up | barefoot | bed_sheet | bow_bra | dakimakura_(medium) | christmas | full_body | black_thighhighs | holding | ahoge | dress | sack | santa_costume | torn_thighhighs | torpedo | capelet | closed_mouth | one_eye_closed | stuffed_toy | cat_cutout | cat_ear_panties | cat_lingerie | cleavage_cutout | choker | side-tie_panties | cat_ears | black_bra | black_panties | frilled_bra | neck_bell | thighhighs | white_bra | white_panties |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:------------|:--------------|:-----------|:-------|:-------------|:--------------------|:--------------------|:-------------------|:---------------------|:--------|:----------------|:-------------------|:-------------|:------------|:---------|:----------------|:---------|:------------|:-----------|:-------------------|:----------|:--------|:------------|:---------------|:----------|:--------|:--------|:--------------|:-------------|:-------------|:---------------|:--------------|:------------|:---------|:---------------|:---------------|:------|:----------|:--------|:-------|:----------------|:----------|:--------|:------|:-----------|:--------|:---------|:-----------------|:-----------|:---------------|:--------------|:----------|:-----------------|:--------|:----------|:----------|:-----------|:------------|:----------|:----------------------|:------------|:------------|:-------------------|:----------|:--------|:--------|:-------|:----------------|:------------------|:----------|:----------|:---------------|:-----------------|:--------------|:-------------|:------------------|:---------------|:------------------|:---------|:-------------------|:-----------|:------------|:----------------|:--------------|:------------|:-------------|:------------|:----------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | | X | | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | X | X | X | | X | | | | X | | | X | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | X | X | X | | X | | X | X | X | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | X | X | X | | X | | X | X | X | | X | | | X | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | X | | X | | X | X | | | X | | | X | | | | | | | | | | | | | X | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 10 |  |  |  |  |  | X | X | X | X | | X | | X | | | | X | | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | | X | | X | | | X | X | | X | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 8 | 5 |  |  |  |  |  | X | X | X | X | | X | | X | X | X | | X | | | X | | | | | | | | | | | | | | X | | X | X | | X | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
daveokpare/glaive-function-calling-v2-chatml | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 240513536
num_examples: 101664
- name: test
num_bytes: 26759126
num_examples: 11296
download_size: 102708419
dataset_size: 267272662
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
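The dataset exposes a single `text` column; judging by the name, each record is a ChatML-formatted conversation. A minimal parsing sketch (the `<|im_start|>`/`<|im_end|>` delimiters are an assumption based on the ChatML convention, not verified against the actual rows):

```python
import re

# Sketch only: delimiter tokens assumed from the ChatML convention
# implied by the dataset name; inspect real records before relying on this.
def parse_chatml(text):
    pattern = r"<\|im_start\|>(\w+)\n(.*?)<\|im_end\|>"
    return [(role, body.strip()) for role, body in re.findall(pattern, text, re.S)]

sample = ("<|im_start|>user\nWhat's the weather?<|im_end|>\n"
          "<|im_start|>assistant\nSunny.<|im_end|>")
print(parse_chatml(sample))  # → [('user', "What's the weather?"), ('assistant', 'Sunny.')]
```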
|
andersonbcdefg/minipile_val_tokenized | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: targets
sequence: int64
splits:
- name: validation
num_bytes: 8317504
num_examples: 1352
download_size: 2873910
dataset_size: 8317504
---
# Dataset Card for "minipile_val_tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LeoLM/ArcChallenge_de | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
struct:
- name: text
sequence: string
- name: label
sequence: string
- name: answerKey
dtype: string
- name: question_de
dtype: string
- name: choices_de
struct:
- name: label
sequence: string
- name: text
sequence: string
- name: translation_de
dtype: string
splits:
- name: test
num_bytes: 1170655
num_examples: 1172
- name: validation
num_bytes: 301790
num_examples: 299
download_size: 807450
dataset_size: 1472445
---
# Dataset Card for "arc_challenge_de"
|
huggingartists/kishlak | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/kishlak"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.12921 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/c0c7e74ec794ad44eb0957d6afdd383d.815x815x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/kishlak">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Кишлак (Kishlak)</div>
<a href="https://genius.com/artists/kishlak">
<div style="text-align: center; font-size: 14px;">@kishlak</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/kishlak).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
You can load this dataset directly with the `datasets` library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/kishlak")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train | validation | test |
|------:|-----------:|-----:|
|    43 |          - |    - |
The 'train' split can easily be divided into 'train', 'validation', and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/kishlak")

train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03  # the remainder after train + validation

texts = datasets['train']['text']
train, validation, test = np.split(
    texts,
    [
        int(len(texts) * train_percentage),
        int(len(texts) * (train_percentage + validation_percentage)),
    ],
)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author = {Aleksey Korshuk},
    year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
nmarafo/truthful_qa_TrueFalse_Feedback | ---
license: apache-2.0
task_categories:
- table-question-answering
language:
- en
---
# Dataset Card for truthful_qa_TrueFalse_Feedback
This is a reduced variation of the truthful_qa dataset (https://huggingface.co/datasets/truthful_qa), modified to associate a boolean value with each given answer, along with a reference correct answer and feedback.
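A hedged sketch of what a record might look like and how it could be rendered for training. The field names below are inferred from the description (a boolean label, a reference answer, and feedback); the actual column names may differ, so inspect the dataset before relying on them.

```python
# Hypothetical record shape inferred from the description above;
# the real column names may differ.
record = {
    "question": "What happens if you eat watermelon seeds?",
    "answer": "You grow watermelons in your stomach.",
    "is_correct": False,
    "correct_answer": "The watermelon seeds pass through your digestive system.",
    "feedback": "The given answer repeats a common myth.",
}

def to_training_text(r):
    """Render one record as a verdict-plus-feedback training string."""
    verdict = "True" if r["is_correct"] else "False"
    return (f"Question: {r['question']}\nAnswer: {r['answer']}\n"
            f"Verdict: {verdict}\nFeedback: {r['feedback']}")

print(to_training_text(record))
```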
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
TruthfulQA:
```
@misc{lin2021truthfulqa,
    title={TruthfulQA: Measuring How Models Mimic Human Falsehoods},
    author={Stephanie Lin and Jacob Hilton and Owain Evans},
    year={2021},
    eprint={2109.07958},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
nk2201/English-to-Hinglish | ---
license: mit
language:
- en
tags:
- translation
pretty_name: json
size_categories:
- 1K<n<10K
---
# Dataset Card for English-to-Hinglish
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/inazuma_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of inazuma/電/电 (Azur Lane)
This is the dataset of inazuma/電/电 (Azur Lane), containing 48 images and their tags.
The core tags of this character are `blue_eyes, blue_hair, horns, long_hair, oni_horns, hair_ornament, breasts, ahoge, bangs, medium_breasts, hair_between_eyes, ponytail, sidelocks, ribbon, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 48 | 55.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/inazuma_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 48 | 36.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/inazuma_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 110 | 73.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/inazuma_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 48 | 50.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/inazuma_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 110 | 95.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/inazuma_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/inazuma_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | 1girl, looking_at_viewer, bare_shoulders, solo, blush, kimono, wide_sleeves, choker, cleavage, black_thighhighs, obi, simple_background, umbrella, white_background |
| 1 | 6 |  |  |  |  |  | 1girl, bare_shoulders, detached_sleeves, hair_flower, looking_at_viewer, solo, frills, hairband, mini_top_hat, skirt, bow, large_breasts, simple_background, striped, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | bare_shoulders | solo | blush | kimono | wide_sleeves | choker | cleavage | black_thighhighs | obi | simple_background | umbrella | white_background | detached_sleeves | hair_flower | frills | hairband | mini_top_hat | skirt | bow | large_breasts | striped |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-----------------|:-------|:--------|:---------|:---------------|:---------|:-----------|:-------------------|:------|:--------------------|:-----------|:-------------------|:-------------------|:--------------|:---------|:-----------|:---------------|:--------|:------|:----------------|:----------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X |
|
dmayhem93/summarization-sft-heirarchical-valid2 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: 125M
dtype: string
- name: 1B
dtype: string
- name: 6B
dtype: string
- name: 20B
dtype: string
splits:
- name: train
num_bytes: 131656080
num_examples: 50720
download_size: 38150074
dataset_size: 131656080
---
# Dataset Card for "summarization-sft-heirarchical-valid2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MiniJake/Record | ---
license: unknown
---
|
PlanTL-GOB-ES/sts-es | ---
annotations_creators:
- expert-generated
language:
- es
language_creators:
- found
multilinguality:
- monolingual
pretty_name: STS-es
size_categories: []
source_datasets: []
tags: []
task_categories:
- text-classification
task_ids:
- semantic-similarity-scoring
- text-scoring
---
# STS-es
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://alt.qcri.org/semeval2014/task10/
- **Point of Contact:** [Aitor Gonzalez](mailto:aitor.gonzalez@bsc.es)
### Dataset Summary
For Semantic Text Similarity, we collected the Spanish test sets from SemEval-2014 (Agirre et al., 2014) and SemEval-2015 (Agirre et al., 2015). Since no training data was provided for the Spanish subtask, we randomly sampled both datasets into 1,321 sentences for the train set, 78 sentences for the development set, and 156 sentences for the test set. To make the task harder for the models, we purposely made the development set smaller than the test set.
We use this corpus as part of the EvalEs Spanish language benchmark.
### Supported Tasks and Leaderboards
Semantic Text Similarity Scoring
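Systems on this task are conventionally scored with the Pearson correlation between predicted and gold similarity values (the standard SemEval STS metric). A minimal, self-contained sketch, with illustrative numbers rather than real model predictions:

```python
import math

def pearson(xs, ys):
    # Pearson correlation between two equal-length lists of scores.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative gold labels (same 0-5 scale as the 'label' field) vs. predictions.
gold = [2.8, 4.0, 1.5, 3.2]
pred = [3.0, 3.8, 1.2, 3.5]
print(round(pearson(gold, pred), 3))
```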
### Languages
The dataset is in Spanish (`es-ES`).
## Dataset Structure
### Data Instances
```
{
  'sentence1': "El \"tendón de Aquiles\" (\"tendo Achillis\") o \"tendón calcáneo\" (\"tendo calcaneus\") es un tendón de la parte posterior de la pierna.",
  'sentence2': "El tendón de Aquiles es la extensión tendinosa de los tres músculos de la pantorrilla: gemelo, sóleo y plantar delgado.",
  'label': 2.8
}
```
### Data Fields
- sentence1: String. The first sentence of the pair.
- sentence2: String. The second sentence of the pair.
- label: Float. The gold similarity score assigned to the sentence pair.
### Data Splits
- train: 1,321 instances
- dev: 78 instances
- test: 156 instances
## Dataset Creation
### Curation Rationale
[N/A]
### Source Data
The source data came from the Spanish Wikipedia (2013 dump) and texts from Spanish news (2014).
For more information visit the paper from the SemEval-2014 Shared Task [(Agirre et al., 2014)](https://aclanthology.org/S14-2010.pdf) and the SemEval-2015 Shared Task [(Agirre et al., 2015)](https://aclanthology.org/S15-2045.pdf).
#### Initial Data Collection and Normalization
For more information visit the paper from the SemEval-2014 Shared Task [(Agirre et al., 2014)](https://aclanthology.org/S14-2010.pdf) and the SemEval-2015 Shared Task [(Agirre et al., 2015)](https://aclanthology.org/S15-2045.pdf).
#### Who are the source language producers?
Journalists and Wikipedia contributors.
### Annotations
#### Annotation process
For more information visit the paper from the SemEval-2014 Shared Task [(Agirre et al., 2014)](https://aclanthology.org/S14-2010.pdf) and the SemEval-2015 Shared Task [(Agirre et al., 2015)](https://aclanthology.org/S15-2045.pdf).
#### Who are the annotators?
For more information visit the paper from the SemEval-2014 Shared Task [(Agirre et al., 2014)](https://aclanthology.org/S14-2010.pdf) and the SemEval-2015 Shared Task [(Agirre et al., 2015)](https://aclanthology.org/S15-2045.pdf).
### Personal and Sensitive Information
No personal or sensitive information included.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset contributes to the development of language models in Spanish.
### Discussion of Biases
No postprocessing steps were applied to mitigate potential social biases.
## Additional Information
### Citation Information
The following papers must be cited when using this corpus:
```
@inproceedings{agirre2015semeval,
title={Semeval-2015 task 2: Semantic textual similarity, english, spanish and pilot on interpretability},
author={Agirre, Eneko and Banea, Carmen and Cardie, Claire and Cer, Daniel and Diab, Mona and Gonzalez-Agirre, Aitor and Guo, Weiwei and Lopez-Gazpio, Inigo and Maritxalar, Montse and Mihalcea, Rada and others},
booktitle={Proceedings of the 9th international workshop on semantic evaluation (SemEval 2015)},
pages={252--263},
year={2015}
}
@inproceedings{agirre2014semeval,
title={SemEval-2014 Task 10: Multilingual Semantic Textual Similarity.},
author={Agirre, Eneko and Banea, Carmen and Cardie, Claire and Cer, Daniel M and Diab, Mona T and Gonzalez-Agirre, Aitor and Guo, Weiwei and Mihalcea, Rada and Rigau, German and Wiebe, Janyce},
booktitle={SemEval@ COLING},
pages={81--91},
year={2014}
}
```
|
skrishna/allenai-real-toxicity-prompts_160M_non_toxic | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 16854
num_examples: 100
- name: test
num_bytes: 7908
num_examples: 50
download_size: 22700
dataset_size: 24762
---
# Dataset Card for "allenai-real-toxicity-prompts_160M_non_toxic"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/lalum_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of lalum (Fire Emblem)
This is the dataset of lalum (Fire Emblem), containing 38 images and their tags.
The core tags of this character are `hair_bun, double_bun, green_eyes, orange_hair, breasts, ribbon, short_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 38 | 35.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lalum_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 38 | 23.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lalum_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 83 | 43.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lalum_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 38 | 32.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lalum_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 83 | 55.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lalum_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lalum_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, hetero, penis, sex, solo_focus, vaginal, sweat, 1boy, nipples, blush, cum_in_pussy, medium_breasts, girl_on_top, hair_ribbon, mosaic_censoring, nude, straddling, tears |
| 1 | 24 |  |  |  |  |  | 1girl, navel, solo, midriff, smile, open_mouth, jewelry, white_background, blush, simple_background, dancer, hair_ornament, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hetero | penis | sex | solo_focus | vaginal | sweat | 1boy | nipples | blush | cum_in_pussy | medium_breasts | girl_on_top | hair_ribbon | mosaic_censoring | nude | straddling | tears | navel | solo | midriff | smile | open_mouth | jewelry | white_background | simple_background | dancer | hair_ornament | looking_at_viewer |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:--------|:------|:-------------|:----------|:--------|:-------|:----------|:--------|:---------------|:-----------------|:--------------|:--------------|:-------------------|:-------|:-------------|:--------|:--------|:-------|:----------|:--------|:-------------|:----------|:-------------------|:--------------------|:---------|:----------------|:--------------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 24 |  |  |  |  |  | X | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
Tristan/olm-test-normal-dedup | ---
dataset_info:
features:
- name: text
dtype: string
- name: url
dtype: string
- name: crawl_timestamp
dtype: float64
splits:
- name: train
num_bytes: 211642596.0
num_examples: 40900
download_size: 128804894
dataset_size: 211642596.0
---
# Dataset Card for "olm-test-normal-dedup"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
csupiisc/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 6656
num_examples: 8
download_size: 6982
dataset_size: 6656
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-marketing-neg-prepend-fix | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6193
num_examples: 5
- name: test
num_bytes: 632204
num_examples: 234
download_size: 14819
dataset_size: 638397
---
# Dataset Card for "mmlu-marketing-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_cola_comparative_than | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 156
num_examples: 2
- name: test
num_bytes: 71
num_examples: 1
- name: train
num_bytes: 2115
num_examples: 27
download_size: 6857
dataset_size: 2342
---
# Dataset Card for "MULTI_VALUE_cola_comparative_than"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cmu-mlsp/encodec_24khz-librispeech_asr-test.clean-features | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 24000
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
- name: id
dtype: string
- name: audio_codes
sequence:
sequence: int64
splits:
- name: test.clean
num_bytes: 958024726.0
num_examples: 2620
download_size: 918826540
dataset_size: 958024726.0
configs:
- config_name: default
data_files:
- split: test.clean
path: data/test.clean-*
---
# Dataset Card for "encodec_24khz-librispeech_asr-test.clean-features"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xx18/R2PE | ---
license: mit
task_categories:
- text-classification
language:
- en
configs:
- config_name: GSM8K
data_files:
- split: gpt3
path: data/gsm8k/text-davinci-003/test.jsonl
- split: gpt3.5
path: data/gsm8k/gpt-3.5-turbo-1106/test.jsonl
- split: gpt_instruct
path: data/gsm8k/gpt-3.5-turbo-instruct/test.jsonl
- split: gemini_pro
path: data/gsm8k/gemini-pro/test.jsonl
- split: mixtral_8x7b
path: data/gsm8k/mixtral-8x7b/test.jsonl
- split: mistral_medium
path: data/gsm8k/mistral-medium/test.jsonl
- config_name: MATH
data_files:
- split: gpt3
path: data/math/text-davinci-003/test.jsonl
- split: gpt3.5
path: data/math/gpt-3.5-turbo-1106/test.jsonl
- split: gpt_instruct
path: data/math/gpt-3.5-turbo-instruct/test.jsonl
- split: gemini_pro
path: data/math/gemini-pro/test.jsonl
- split: mixtral_8x7b
path: data/math/mixtral-8x7b/test.jsonl
- split: mistral_medium
path: data/math/mistral-medium/test.jsonl
- config_name: StrategyQA
data_files:
- split: gpt3
path: data/StrategyQA/text-davinci-003/test.jsonl
- split: gpt3.5
path: data/StrategyQA/gpt-3.5-turbo-1106/test.jsonl
- split: gpt_instruct
path: data/StrategyQA/gpt-3.5-turbo-instruct/test.jsonl
- split: gemini_pro
path: data/StrategyQA/gemini-pro/test.jsonl
- split: mixtral_8x7b
path: data/StrategyQA/mixtral-8x7b/test.jsonl
- split: mistral_medium
path: data/StrategyQA/mistral-medium/test.jsonl
- config_name: Play
data_files:
- split: gpt3
path: data/play/text-davinci-003/test.jsonl
- split: gpt3.5
path: data/play/gpt-3.5-turbo-1106/test.jsonl
- split: gpt_instruct
path: data/play/gpt-3.5-turbo-instruct/test.jsonl
- split: gemini_pro
path: data/play/gemini-pro/test.jsonl
- split: mixtral_8x7b
path: data/play/mixtral-8x7b/test.jsonl
- split: mistral_medium
path: data/play/mistral-medium/test.jsonl
- config_name: Physics
data_files:
- split: gpt3
path: data/physics/text-davinci-003/test.jsonl
- split: gpt3.5
path: data/physics/gpt-3.5-turbo-1106/test.jsonl
- split: gpt_instruct
path: data/physics/gpt-3.5-turbo-instruct/test.jsonl
- split: gemini_pro
path: data/physics/gemini-pro/test.jsonl
- split: mixtral_8x7b
path: data/physics/mixtral-8x7b/test.jsonl
- split: mistral_medium
path: data/physics/mistral-medium/test.jsonl
- config_name: FEVER
data_files:
- split: gpt3
path: data/Fever/text-davinci-003/test.jsonl
- split: gpt3.5
path: data/Fever/gpt-3.5-turbo-1106/test.jsonl
- split: gpt_instruct
path: data/Fever/gpt-3.5-turbo-instruct/test.jsonl
- split: gemini_pro
path: data/Fever/gemini-pro/test.jsonl
- split: mixtral_8x7b
path: data/Fever/mixtral-8x7b/test.jsonl
- config_name: HotpotQA
data_files:
- split: gpt3
path: data/HotpotQA/text-davinci-003/test.jsonl
- split: gpt4
path: data/HotpotQA/gpt-4-0314/test.jsonl
- split: gpt_instruct
path: data/HotpotQA/gpt-3.5-turbo-instruct/test.jsonl
- split: gemini_pro
path: data/HotpotQA/gemini-pro/test.jsonl
- split: mixtral_8x7b
path: data/HotpotQA/mixtral-8x7b/test.jsonl
- config_name: 2WikiMultihop
data_files:
- split: gpt3
path: data/2WikiMultihop/text-davinci-003/test.jsonl
- split: gpt4
path: data/2WikiMultihop/gpt-4-0314/test.jsonl
- split: gpt_instruct
path: data/2WikiMultihop/gpt-3.5-turbo-instruct/test.jsonl
- split: gemini_pro
path: data/2WikiMultihop/gemini-pro/test.jsonl
- split: mixtral_8x7b
path: data/2WikiMultihop/mixtral-8x7b/test.jsonl
pretty_name: R2PE
size_categories:
- 10K<n<100K
---
# Dataset Card for R2PE Benchmark
- GitHub repository: https://github.com/XinXU-USTC/R2PE
- Paper: [Can We Verify Step by Step for Incorrect Answer Detection?](https://arxiv.org/abs/2402.10528)
## Dataset Summary
- This is the R2PE (Relation of Rationales and Performance Evaluation) benchmark.
- The aim is to explore the connection between the quality of reasoning chains and end-task performance.
- We use CoT-SC to collect responses on 8 reasoning tasks spanning 5 domains, with various answer formats, from 6 different LLMs.
| Dataset | Task Type | Answer Format | Domain |
|--------------|------------------------|-----------------|-----------------|
| GSM8K | Mathematical Reasoning | Numeric | Mathematics |
| MATH | Mathematical Reasoning | Numeric | Mathematics |
| StrategyQA | Common Sense Reasoning | Yes/No | Commonsense |
| Play | Common Sense Reasoning | Yes/No | Literature |
| Physics | Physical Reasoning | Multiple Choice | Physics |
| FEVER | Fact Verification | Yes/No | World Knowledge |
| HotpotQA | Open-Domain QA | Free Form | World Knowledge |
| 2WikiMultihop| Open-Domain QA | Free Form | World Knowledge |
## Dataset Structure
### Data Fields
| Field Name | Value | Description |
| ----------- | ----------- | ------------------------------------------- |
| question | string | The question or claim used to query LLM from the original dataset. |
| id | string or int | id of 'question' in the original dataset. |
| dataset | string | Which dataset 'question' is from (e.g. FEVER, HotpotQA, or 2WikiMultihop). |
| llm | string | LLM used to query. |
| responses | list | A list of five responses generated by 'llm name' for a 'question' from the 'dataset'. Each response contains a rationale and an answer |
| rationales | list | A list of rationales segmented from 'responses'. |
| answers | list | A list of answers segmented from 'responses'. |
| output | string | The final answer selected from 'answers' by majority voting. |
| ground-truth | list or string | The ground-truth answer or answer list provided for 'question' from the 'dataset'. |
| label | Boolean | {True, False} to indicate whether 'output' matches the 'ground-truth'. |
### Data Instances
An example looks as follows:
```python
{'question': 'Which film was released earlier, Navavadhu or The January Man?',
 'id': '5effec28087111ebbd63ac1f6bf848b6',
 'dataset': '2WikiMultihop',
 'llm': 'text-davinci-003',
 'responses': ["First, Navavadhu was released on 15 February 2019. Second, The January Man was released on 17 February 1989. The answer is The January Man.",
"First, film Navavadhu was released on 17 August 1979. Second, The January Man was released on 24 August 1989. The answer is Navavadhu.",
"First, film Navavadhu was released on 8 April 1988. Second, The January Man was released on 11 August 1989. The answer is Navavadhu.",
"First, film Navavadhu was released on 21 August 1992. Second, The January Man was released on 11 August 1989. The answer is The January Man.",
"First, film Navavadhu was released on 15 February 2019. Second, The January Man was released on 10 February 1989. The answer is The January Man."],
'rationales': ["First, Navavadhu was released on 15 February 2019. Second, The January Man was released on 17 February 1989.",
"First, film Navavadhu was released on 17 August 1979. Second, The January Man was released on 24 August 1989.",
"First, film Navavadhu was released on 8 April 1988. Second, The January Man was released on 11 August 1989.",
"First, film Navavadhu was released on 21 August 1992. Second, The January Man was released on 11 August 1989.",
"First, film Navavadhu was released on 15 February 2019. Second, The January Man was released on 10 February 1989."],
'answers': ["The January Man", "Navavadhu", "Navavadhu", "The January Man", "The January Man"],
'output': "The January Man",
'ground-truth': 'Navavadhu',
'label': False}
```
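As the field table above states, 'output' is obtained from 'answers' by majority voting, and 'label' records whether it matches 'ground-truth'. A minimal sketch using the values from the instance above (field names from this card; the tie-breaking rule is an assumption, not from the paper):

```python
from collections import Counter

# Values taken from the example instance above.
answers = ["The January Man", "Navavadhu", "Navavadhu",
           "The January Man", "The January Man"]
ground_truth = "Navavadhu"

# Majority vote over the five sampled answers; Counter.most_common breaks
# ties by first-seen order, which is only an assumed convention here.
output, _ = Counter(answers).most_common(1)[0]
label = output == ground_truth

print(output, label)  # -> The January Man False
```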
The statistics for R2PE are as follows.
| Dataset | Method | GPT3 | GPT-instruct | GPT-3.5 | Gemini | Mixtral | Mistral |
|--------------- |------------|------|--------------|---------|--------|---------|---------|
| GSM8K | FALSE | 510 | 300 | 326 | 246 | 389 | 225 |
| | total | 1319 | 1319 | 1250 | 1319 | 1278 | 1313 |
| MATH | FALSE | 827 | 674 | 380 | 697 | 737 | 719 |
| | total | 998 | 1000 | 1000 | 1000 | 999 | 1000 |
| StrategyQA | FALSE | 490 | 368 | 399 | 445 | 553 | 479 |
| | total | 1000 | 1000 | 1000 | 988 | 1000 | 1000 |
| Play | FALSE | 409 | 454 | 487 | 385 | 634 | 448 |
| | total | 1000 | 1000 | 1000 | 984 | 1000 | 1000 |
| Physics | FALSE | 56 | 50 | 70 | 191 | 107 | 109 |
| | total | 227 | 227 | 227 | 227 | 227 | 227 |
| FEVER | FALSE | 485 | 432 | 441 | 449 | 570 | - |
| | total | 1000 | 1000 | 1000 | 1000 | 1000 | - |
| HotpotQA | FALSE | 217 | 175 | 192 | 219 | 199 | - |
| | total | 308 | 308 | 308 | 308 | 308 | - |
| 2WikiMultihop | FALSE | 626 | 598 | 401 | 629 | 562 | - |
| | total | 1000 | 1000 | 1000 | 1000 | 1000 | - |
### Citation Information
```bibtex
@misc{xu2024verify,
title={Can We Verify Step by Step for Incorrect Answer Detection?},
author={Xin Xu and Shizhe Diao and Can Yang and Yang Wang},
year={2024},
eprint={2402.10528},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
heliosprime/twitter_dataset_1713018201 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 9791
num_examples: 23
download_size: 9188
dataset_size: 9791
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713018201"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ericpolewski__Palworld-SME-13b | ---
pretty_name: Evaluation run of ericpolewski/Palworld-SME-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ericpolewski/Palworld-SME-13b](https://huggingface.co/ericpolewski/Palworld-SME-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ericpolewski__Palworld-SME-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T12:30:34.834503](https://huggingface.co/datasets/open-llm-leaderboard/details_ericpolewski__Palworld-SME-13b/blob/main/results_2024-02-09T12-30-34.834503.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.532296003908677,\n\
\ \"acc_stderr\": 0.033825002823228846,\n \"acc_norm\": 0.5413466673673525,\n\
\ \"acc_norm_stderr\": 0.034679022812202726,\n \"mc1\": 0.3243574051407589,\n\
\ \"mc1_stderr\": 0.016387976779647935,\n \"mc2\": 0.4666625095183999,\n\
\ \"mc2_stderr\": 0.015175138209414976\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5162116040955631,\n \"acc_stderr\": 0.014603708567414945,\n\
\ \"acc_norm\": 0.5554607508532423,\n \"acc_norm_stderr\": 0.014521226405627075\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6077474606652061,\n\
\ \"acc_stderr\": 0.004872546302641848,\n \"acc_norm\": 0.808105954989046,\n\
\ \"acc_norm_stderr\": 0.003929854025801025\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\
\ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854498,\n\
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854498\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006716,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006716\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.031967586978353627,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.031967586978353627\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489361,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489361\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30687830687830686,\n \"acc_stderr\": 0.023752928712112133,\n \"\
acc_norm\": 0.30687830687830686,\n \"acc_norm_stderr\": 0.023752928712112133\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.049020713000019756,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.049020713000019756\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6258064516129033,\n \"acc_stderr\": 0.0275289042998457,\n \"acc_norm\"\
: 0.6258064516129033,\n \"acc_norm_stderr\": 0.0275289042998457\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3694581280788177,\n\
\ \"acc_stderr\": 0.03395970381998574,\n \"acc_norm\": 0.3694581280788177,\n\
\ \"acc_norm_stderr\": 0.03395970381998574\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \
\ \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.033586181457325226,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033586181457325226\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817234,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817234\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.02535100632816969,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.02535100632816969\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815635,\n\
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815635\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413925,\n \
\ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413925\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659809,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659809\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7376146788990826,\n \"acc_stderr\": 0.018861885021534734,\n \"\
acc_norm\": 0.7376146788990826,\n \"acc_norm_stderr\": 0.018861885021534734\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696043,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036416,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036416\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416828,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416828\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.02685345037700917,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.02685345037700917\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.719029374201788,\n\
\ \"acc_stderr\": 0.01607312785122122,\n \"acc_norm\": 0.719029374201788,\n\
\ \"acc_norm_stderr\": 0.01607312785122122\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5982658959537572,\n \"acc_stderr\": 0.026394104177643634,\n\
\ \"acc_norm\": 0.5982658959537572,\n \"acc_norm_stderr\": 0.026394104177643634\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3094972067039106,\n\
\ \"acc_stderr\": 0.01546116900237154,\n \"acc_norm\": 0.3094972067039106,\n\
\ \"acc_norm_stderr\": 0.01546116900237154\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5620915032679739,\n \"acc_stderr\": 0.02840830202033269,\n\
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.02840830202033269\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.027417996705630995,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.027417996705630995\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.02662415247884585,\n\
\ \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.02662415247884585\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573086,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573086\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n\
\ \"acc_stderr\": 0.01268590653820624,\n \"acc_norm\": 0.4426336375488918,\n\
\ \"acc_norm_stderr\": 0.01268590653820624\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5490196078431373,\n \"acc_stderr\": 0.020130388312904524,\n \
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.020130388312904524\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5510204081632653,\n \"acc_stderr\": 0.03184213866687579,\n\
\ \"acc_norm\": 0.5510204081632653,\n \"acc_norm_stderr\": 0.03184213866687579\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6915422885572139,\n\
\ \"acc_stderr\": 0.03265819588512697,\n \"acc_norm\": 0.6915422885572139,\n\
\ \"acc_norm_stderr\": 0.03265819588512697\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3243574051407589,\n\
\ \"mc1_stderr\": 0.016387976779647935,\n \"mc2\": 0.4666625095183999,\n\
\ \"mc2_stderr\": 0.015175138209414976\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7482241515390686,\n \"acc_stderr\": 0.012198489100259781\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.021986353297952996,\n \
\ \"acc_stderr\": 0.004039162758110039\n }\n}\n```"
repo_url: https://huggingface.co/openchat/openchat_v3.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|arc:challenge|25_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|gsm8k|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hellaswag|10_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T12-30-34.834503.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T12-30-34.834503.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- '**/details_harness|winogrande|5_2024-02-09T12-30-34.834503.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T12-30-34.834503.parquet'
- config_name: results
data_files:
- split: 2024_02_09T12_30_34.834503
path:
- results_2024-02-09T12-30-34.834503.parquet
- split: latest
path:
- results_2024-02-09T12-30-34.834503.parquet
---
# Dataset Card for Evaluation run of ericpolewski/Palworld-SME-13b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ericpolewski/Palworld-SME-13b](https://huggingface.co/ericpolewski/Palworld-SME-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ericpolewski__Palworld-SME-13b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-09T12:30:34.834503](https://huggingface.co/datasets/open-llm-leaderboard/details_ericpolewski__Palworld-SME-13b/blob/main/results_2024-02-09T12-30-34.834503.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" and "latest" splits for each eval):
```python
{
"all": {
"acc": 0.532296003908677,
"acc_stderr": 0.033825002823228846,
"acc_norm": 0.5413466673673525,
"acc_norm_stderr": 0.034679022812202726,
"mc1": 0.3243574051407589,
"mc1_stderr": 0.016387976779647935,
"mc2": 0.4666625095183999,
"mc2_stderr": 0.015175138209414976
},
"harness|arc:challenge|25": {
"acc": 0.5162116040955631,
"acc_stderr": 0.014603708567414945,
"acc_norm": 0.5554607508532423,
"acc_norm_stderr": 0.014521226405627075
},
"harness|hellaswag|10": {
"acc": 0.6077474606652061,
"acc_stderr": 0.004872546302641848,
"acc_norm": 0.808105954989046,
"acc_norm_stderr": 0.003929854025801025
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.030242233800854498,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.030242233800854498
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006716,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006716
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489361,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489361
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30687830687830686,
"acc_stderr": 0.023752928712112133,
"acc_norm": 0.30687830687830686,
"acc_norm_stderr": 0.023752928712112133
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.049020713000019756,
"acc_norm": 0.39,
"acc_norm_stderr": 0.049020713000019756
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6258064516129033,
"acc_stderr": 0.0275289042998457,
"acc_norm": 0.6258064516129033,
"acc_norm_stderr": 0.0275289042998457
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.03395970381998574,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.03395970381998574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033586181457325226,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033586181457325226
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817234,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5,
"acc_stderr": 0.02535100632816969,
"acc_norm": 0.5,
"acc_norm_stderr": 0.02535100632816969
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815635,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815635
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.03181110032413925,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.03181110032413925
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659809,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659809
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7376146788990826,
"acc_stderr": 0.018861885021534734,
"acc_norm": 0.7376146788990826,
"acc_norm_stderr": 0.018861885021534734
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696043,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036416,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036416
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416828,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416828
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.02685345037700917,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.02685345037700917
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.719029374201788,
"acc_stderr": 0.01607312785122122,
"acc_norm": 0.719029374201788,
"acc_norm_stderr": 0.01607312785122122
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5982658959537572,
"acc_stderr": 0.026394104177643634,
"acc_norm": 0.5982658959537572,
"acc_norm_stderr": 0.026394104177643634
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3094972067039106,
"acc_stderr": 0.01546116900237154,
"acc_norm": 0.3094972067039106,
"acc_norm_stderr": 0.01546116900237154
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.02840830202033269,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.02840830202033269
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.027417996705630995,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.027417996705630995
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6450617283950617,
"acc_stderr": 0.02662415247884585,
"acc_norm": 0.6450617283950617,
"acc_norm_stderr": 0.02662415247884585
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573086,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573086
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4426336375488918,
"acc_stderr": 0.01268590653820624,
"acc_norm": 0.4426336375488918,
"acc_norm_stderr": 0.01268590653820624
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.020130388312904524,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.020130388312904524
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5510204081632653,
"acc_stderr": 0.03184213866687579,
"acc_norm": 0.5510204081632653,
"acc_norm_stderr": 0.03184213866687579
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6915422885572139,
"acc_stderr": 0.03265819588512697,
"acc_norm": 0.6915422885572139,
"acc_norm_stderr": 0.03265819588512697
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3243574051407589,
"mc1_stderr": 0.016387976779647935,
"mc2": 0.4666625095183999,
"mc2_stderr": 0.015175138209414976
},
"harness|winogrande|5": {
"acc": 0.7482241515390686,
"acc_stderr": 0.012198489100259781
},
"harness|gsm8k|5": {
"acc": 0.021986353297952996,
"acc_stderr": 0.004039162758110039
}
}
```
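Once downloaded, the results JSON above can be post-processed with plain Python. As a minimal sketch (not part of the official leaderboard tooling), the snippet below macro-averages per-task accuracies from a hypothetical subset of the dict, preferring `acc_norm` where present:

```python
# Sketch: macro-average per-task accuracies from a results dict shaped like
# the JSON above. `results` here is a hypothetical subset for illustration;
# in practice you would parse the full JSON file linked above.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.5554607508532423},
    "harness|hellaswag|10": {"acc_norm": 0.808105954989046},
    "harness|winogrande|5": {"acc": 0.7482241515390686},
}

# Prefer the normalized accuracy when a task reports one.
accs = [task.get("acc_norm", task.get("acc")) for task in results.values()]
macro_avg = sum(accs) / len(accs)
print(f"macro-average over {len(accs)} tasks: {macro_avg:.4f}")
```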
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-source-metrics/pytorch-image-models-dependents | ---
license: apache-2.0
pretty_name: pytorch-image-models metrics
tags:
- github-stars
dataset_info:
features:
- name: name
dtype: 'null'
- name: stars
dtype: 'null'
- name: forks
dtype: 'null'
splits:
- name: package
- name: repository
download_size: 1798
dataset_size: 0
---
# pytorch-image-models metrics
This dataset contains metrics about the huggingface/pytorch-image-models package.
Number of repositories in the dataset: 3615
Number of packages in the dataset: 89
## Package dependents
This contains the data available in the [used-by](https://github.com/huggingface/pytorch-image-models/network/dependents)
tab on GitHub.
### Package & Repository star count
This section shows the package and repository star count, individually.
Package | Repository
:-------------------------:|:-------------------------:
 | 
There are 18 packages that have more than 1000 stars.
There are 39 repositories that have more than 1000 stars.
The top 10 in each category are the following:
*Package*
[huggingface/transformers](https://github.com/huggingface/transformers): 70536
[fastai/fastai](https://github.com/fastai/fastai): 22776
[open-mmlab/mmdetection](https://github.com/open-mmlab/mmdetection): 21390
[MVIG-SJTU/AlphaPose](https://github.com/MVIG-SJTU/AlphaPose): 6424
[qubvel/segmentation_models.pytorch](https://github.com/qubvel/segmentation_models.pytorch): 6115
[awslabs/autogluon](https://github.com/awslabs/autogluon): 4818
[neuml/txtai](https://github.com/neuml/txtai): 2531
[open-mmlab/mmaction2](https://github.com/open-mmlab/mmaction2): 2357
[open-mmlab/mmselfsup](https://github.com/open-mmlab/mmselfsup): 2271
[lukas-blecher/LaTeX-OCR](https://github.com/lukas-blecher/LaTeX-OCR): 1999
*Repository*
[huggingface/transformers](https://github.com/huggingface/transformers): 70536
[commaai/openpilot](https://github.com/commaai/openpilot): 35919
[facebookresearch/detectron2](https://github.com/facebookresearch/detectron2): 22287
[ray-project/ray](https://github.com/ray-project/ray): 22057
[open-mmlab/mmdetection](https://github.com/open-mmlab/mmdetection): 21390
[NVIDIA/DeepLearningExamples](https://github.com/NVIDIA/DeepLearningExamples): 9260
[microsoft/unilm](https://github.com/microsoft/unilm): 6664
[pytorch/tutorials](https://github.com/pytorch/tutorials): 6331
[qubvel/segmentation_models.pytorch](https://github.com/qubvel/segmentation_models.pytorch): 6115
[hpcaitech/ColossalAI](https://github.com/hpcaitech/ColossalAI): 4944
### Package & Repository fork count
This section shows the package and repository fork count, individually.
Package | Repository
:-------------------------:|:-------------------------:
 | 
There are 12 packages that have more than 200 forks.
There are 28 repositories that have more than 200 forks.
The top 10 in each category are the following:
*Package*
[huggingface/transformers](https://github.com/huggingface/transformers): 16175
[open-mmlab/mmdetection](https://github.com/open-mmlab/mmdetection): 7791
[fastai/fastai](https://github.com/fastai/fastai): 7296
[MVIG-SJTU/AlphaPose](https://github.com/MVIG-SJTU/AlphaPose): 1765
[qubvel/segmentation_models.pytorch](https://github.com/qubvel/segmentation_models.pytorch): 1217
[open-mmlab/mmaction2](https://github.com/open-mmlab/mmaction2): 787
[awslabs/autogluon](https://github.com/awslabs/autogluon): 638
[open-mmlab/mmselfsup](https://github.com/open-mmlab/mmselfsup): 321
[rwightman/efficientdet-pytorch](https://github.com/rwightman/efficientdet-pytorch): 265
[lukas-blecher/LaTeX-OCR](https://github.com/lukas-blecher/LaTeX-OCR): 247
*Repository*
[huggingface/transformers](https://github.com/huggingface/transformers): 16175
[open-mmlab/mmdetection](https://github.com/open-mmlab/mmdetection): 7791
[commaai/openpilot](https://github.com/commaai/openpilot): 6603
[facebookresearch/detectron2](https://github.com/facebookresearch/detectron2): 6033
[ray-project/ray](https://github.com/ray-project/ray): 3879
[pytorch/tutorials](https://github.com/pytorch/tutorials): 3478
[NVIDIA/DeepLearningExamples](https://github.com/NVIDIA/DeepLearningExamples): 2499
[microsoft/unilm](https://github.com/microsoft/unilm): 1223
[qubvel/segmentation_models.pytorch](https://github.com/qubvel/segmentation_models.pytorch): 1217
[layumi/Person_reID_baseline_pytorch](https://github.com/layumi/Person_reID_baseline_pytorch): 928
|
CyberHarem/kiba_manami_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kiba_manami/木場真奈美 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of kiba_manami/木場真奈美 (THE iDOLM@STER: Cinderella Girls), containing 73 images and their tags.
The core tags of this character are `short_hair, green_eyes, brown_hair, breasts, large_breasts, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 73 | 75.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiba_manami_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 73 | 49.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiba_manami_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 154 | 95.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiba_manami_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 73 | 69.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiba_manami_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 154 | 125.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiba_manami_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kiba_manami_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, cleavage, smile, solo, necklace, bracelet, fingerless_gloves, looking_at_viewer, black_gloves, midriff, belt, black_shorts, hair_between_eyes, holding_microphone, medium_breasts, navel, simple_background, thighhighs, black_footwear, character_name, open_jacket, open_mouth, short_sleeves, standing, thigh_boots |
| 1 | 5 |  |  |  |  |  | 1girl, smile, solo, character_name, medium_breasts, pants, belt, card_(medium), cleavage, gem_(symbol), looking_at_viewer, blue_background, frills, hat_removed, necklace |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | smile | solo | necklace | bracelet | fingerless_gloves | looking_at_viewer | black_gloves | midriff | belt | black_shorts | hair_between_eyes | holding_microphone | medium_breasts | navel | simple_background | thighhighs | black_footwear | character_name | open_jacket | open_mouth | short_sleeves | standing | thigh_boots | pants | card_(medium) | gem_(symbol) | blue_background | frills | hat_removed |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------|:-------|:-----------|:-----------|:--------------------|:--------------------|:---------------|:----------|:-------|:---------------|:--------------------|:---------------------|:-----------------|:--------|:--------------------|:-------------|:-----------------|:-----------------|:--------------|:-------------|:----------------|:-----------|:--------------|:--------|:----------------|:---------------|:------------------|:---------|:--------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | | X | | | X | | | | X | | | | | X | | | | | | X | X | X | X | X | X |
|
iarbel/od_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: height
dtype: int64
- name: width
dtype: int64
- name: image_id
dtype: string
- name: objects
struct:
- name: area
sequence: int64
- name: bbox
sequence:
sequence: int64
- name: category
sequence: int64
splits:
- name: train
num_bytes: 4345863.0
num_examples: 80
- name: test
num_bytes: 1017795.0
num_examples: 19
download_size: 5262915
dataset_size: 5363658.0
---
# Dataset Card for "od_dataset"
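The YAML metadata above declares `objects` as a struct of parallel sequences (`area`, `bbox`, `category`). A minimal sketch of walking one record, using a hypothetical row rather than real data (the `[x, y, w, h]` bbox layout is an assumption; verify it against the actual dataset):

```python
# Hypothetical record mirroring the `objects` struct declared in the card's
# YAML: parallel per-object sequences of area, bbox, and category.
record = {
    "image_id": "img_0001",
    "objects": {
        "area": [1200, 450],
        "bbox": [[10, 20, 50, 40], [100, 80, 30, 15]],  # assumed [x, y, w, h]
        "category": [0, 2],
    },
}

# Zip the parallel sequences into per-object tuples.
objs = record["objects"]
boxes = list(zip(objs["category"], objs["bbox"], objs["area"]))
for cat, bbox, area in boxes:
    print(f"category={cat} bbox={bbox} area={area}")
```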
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Lithicsoft/Moloom-Guanaco-9k | ---
language:
- en
- es
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 15401731
num_examples: 9846
download_size: 9094493
dataset_size: 15401731
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
morgendigital/dialect-at-tirol | ---
license: apache-2.0
task_categories:
- text-generation
language:
- de
pretty_name: 'Austrian Dialect: Tyrolean'
size_categories:
- n<1K
---
# Dataset of Tyrolean Dialect (Austria)
This dataset contains 200+ words used in Tirol (Austria), together with their German translations and (optionally) their meanings.
Fakhraddin/NLMCXR | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: text
dtype: string
- name: path
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 1085509616.475
num_examples: 5925
- name: validation
num_bytes: 273304928.6
num_examples: 1505
download_size: 1362990038
dataset_size: 1358814545.0749998
---
# Dataset Card for "NLMCXR"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jorgvt/TID2008 | ---
tags:
- image-quality
pretty_name: TAMPERE IMAGE DATABASE 2008
size_categories:
- 1K<n<10K
---
An image quality assessment dataset consisting of 25 reference images, 17 distortion types, and 4 intensity levels per distortion. In total there are 1700 (reference, distortion, MOS) tuples. |
mcorsa/swifterX-4k-clean | ---
license: apache-2.0
---
|
pattern123/sidewalk-imagery | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 3138394.0
num_examples: 10
download_size: 3139599
dataset_size: 3138394.0
---
# Dataset Card for "sidewalk-imagery"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Xiangyuden/Network-Fusion | ---
license: mit
---
|
sahilkadge/demo | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': dev
'1': test
'2': train
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 39043212.0
num_examples: 49
- name: validation
num_bytes: 980846.0
num_examples: 1
- name: test
num_bytes: 5066562.0
num_examples: 7
download_size: 44985287
dataset_size: 45090620.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
liuyanchen1015/MULTI_VALUE_sst2_perfect_already | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 3411
num_examples: 23
- name: test
num_bytes: 9243
num_examples: 57
- name: train
num_bytes: 130063
num_examples: 968
download_size: 67205
dataset_size: 142717
---
# Dataset Card for "MULTI_VALUE_sst2_perfect_already"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nuvocare/Ted2020_en_es_fr_de_it_ca_pl_ru_nl | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: de
dtype: string
- name: en
dtype: string
- name: es
dtype: string
- name: fr
dtype: string
- name: it
dtype: string
- name: nl
dtype: string
- name: pl
dtype: string
- name: ru
dtype: string
splits:
- name: train
num_bytes: 191053803
num_examples: 258098
- name: test
num_bytes: 4930156
num_examples: 7213
- name: validation
num_bytes: 4326695
num_examples: 6049
download_size: 116856833
dataset_size: 200310654
---
# Dataset Card for "Ted2020_en_es_fr_de_it_ca_pl_ru_nl"
This dataset is an extract of the TED2020 corpus covering only English, Spanish, French, German, Italian, Polish, Russian, and Dutch.
It is used to build multilingual biomedical language models.
A teacher model encodes the English sentence, and
a student model is trained to encode the parallel sentences in the other languages by minimizing the Euclidean distance to the teacher encoding.
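The teacher-student objective described above can be sketched as follows. This is a minimal illustration with toy vectors; a real setup would use actual sentence encoders (for example 768-dimensional sentence-transformer embeddings), and the function name below is only for illustration:

```python
import numpy as np

def distillation_loss(teacher_emb: np.ndarray, student_embs: dict) -> float:
    """Mean squared Euclidean distance between the teacher's English
    embedding and the student's embeddings of the parallel sentences."""
    losses = [np.sum((emb - teacher_emb) ** 2) for emb in student_embs.values()]
    return float(np.mean(losses))

# Toy 4-dimensional embeddings standing in for real sentence encodings.
teacher = np.array([1.0, 0.0, 0.0, 0.0])
students = {
    "fr": np.array([1.0, 0.0, 0.0, 0.0]),  # perfect match, squared distance 0
    "de": np.array([0.0, 1.0, 0.0, 0.0]),  # squared distance 2
}
print(distillation_loss(teacher, students))  # -> 1.0
```

Minimizing this loss pulls the student's non-English encodings toward the teacher's English encoding, which is how the parallel TED2020 sentence pairs are exploited.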
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alpayariyak/MATH_Instruct_no_input | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 9423883
num_examples: 12500
download_size: 4856922
dataset_size: 9423883
---
# Dataset Card for "MATH_Instruct_no_input"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-college_medicine-rule-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 168408
num_examples: 173
download_size: 87028
dataset_size: 168408
---
# Dataset Card for "mmlu-college_medicine-rule-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zeppelin-43/digging_fps_yt_seg_sample_heap | ---
dataset_info:
features:
- name: image
dtype: image
- name: name
dtype: string
- name: condition
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 3036459295.89
num_examples: 3722
download_size: 2733884336
dataset_size: 3036459295.89
---
# Dataset Card for "digging_fps_yt_seg_sample_heap"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ittailup/lallama-data-chat | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 8086191762
num_examples: 1054559
download_size: 4359870365
dataset_size: 8086191762
---
# Dataset Card for "lallama-data-chat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Finnish-NLP/ultrafeedback_deepl_sft_dpo_filtered | ---
language:
- fi
license: mit
task_categories:
- text-generation
dataset_info:
features:
- name: instruction
dtype: string
- name: response_accepted
dtype: string
- name: response_rejected
dtype: string
- name: instruction_perplexity_kenlm
dtype: int64
- name: chosen_response_perplexity_kenlm
dtype: int64
- name: rejected_response_perplexity_kenlm
dtype: int64
- name: combined_perplexity_dpo
dtype: int64
- name: combined_perplexity_sft
dtype: int64
- name: instruction_lang
dtype: string
- name: instruction_lang_proba
dtype: float64
- name: chosen_response_lang
dtype: string
- name: chosen_response_lang_proba
dtype: float64
- name: rejected_response_lang
dtype: string
- name: rejected_response_lang_proba
dtype: float64
- name: perplexity_instruction_len_ratio
dtype: float64
- name: perplexity_response_len_ratio
dtype: float64
- name: dataset_source
dtype: string
- name: __index_level_0__
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 74380857
num_examples: 12712
download_size: 42245567
dataset_size: 74380857
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for Finnish-NLP/ultrafeedback_deepl_sft_dpo_filtered
## Creation process
- Load data from https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized/viewer/default/train_sft
- Run zero-shot classification with facebook/bart-large-mnli along these lines (the actual implementation may differ slightly):
```python
preds = pipe(
    f'{row["instruction"]} is a question about:',
    candidate_labels=["USA related question", "Math related question",
                      "General question", "Coding related question"],
)
```
- Filter out rows that score too high in the categories ["USA related question", "Math related question", "Coding related question"]
- Write the rows to a .txt file, with `***` on its own line separating instruction / accepted response / rejected response, and `END` on its own line separating samples
- Upload the file to deepl.com for file translation, then parse the samples back from the translated files, followed by possible additional cleaning/filtering based on fasttext langdetect / KenLM perplexity |
dwret/cacaoo8 | ---
license: creativeml-openrail-m
---
|
open-llm-leaderboard/details_gmonsoon__Delta-4B-Base | ---
pretty_name: Evaluation run of gmonsoon/Delta-4B-Base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gmonsoon/Delta-4B-Base](https://huggingface.co/gmonsoon/Delta-4B-Base) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gmonsoon__Delta-4B-Base\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-07T14:03:55.643123](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__Delta-4B-Base/blob/main/results_2024-03-07T14-03-55.643123.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5905528687838363,\n\
\ \"acc_stderr\": 0.03352325174942067,\n \"acc_norm\": 0.5934267748828804,\n\
\ \"acc_norm_stderr\": 0.034203989694081526,\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046042,\n \"mc2\": 0.5173591737183307,\n\
\ \"mc2_stderr\": 0.016143767707448048\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5674061433447098,\n \"acc_stderr\": 0.014478005694182526,\n\
\ \"acc_norm\": 0.5861774744027304,\n \"acc_norm_stderr\": 0.014392730009221004\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5875323640709023,\n\
\ \"acc_stderr\": 0.0049127238489447955,\n \"acc_norm\": 0.7628958374825732,\n\
\ \"acc_norm_stderr\": 0.004244374809273614\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.02977308271331987,\n\
\ \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.02977308271331987\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663434,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663434\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4470899470899471,\n \"acc_stderr\": 0.025606723995777025,\n \"\
acc_norm\": 0.4470899470899471,\n \"acc_norm_stderr\": 0.025606723995777025\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7096774193548387,\n\
\ \"acc_stderr\": 0.025822106119415895,\n \"acc_norm\": 0.7096774193548387,\n\
\ \"acc_norm_stderr\": 0.025822106119415895\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.024468615241478926,\n\
\ \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.024468615241478926\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.0287420409039485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.0287420409039485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \
\ \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.01726674208763082,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.01726674208763082\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.03409386946992699,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.03409386946992699\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.03058759135160425,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.03058759135160425\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.02845882099146029,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.02845882099146029\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.026453508054040332,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.026453508054040332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6909323116219668,\n\
\ \"acc_stderr\": 0.016524988919702208,\n \"acc_norm\": 0.6909323116219668,\n\
\ \"acc_norm_stderr\": 0.016524988919702208\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22905027932960895,\n\
\ \"acc_stderr\": 0.014054314935614562,\n \"acc_norm\": 0.22905027932960895,\n\
\ \"acc_norm_stderr\": 0.014054314935614562\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.02773283435336394,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.02773283435336394\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\
\ \"acc_stderr\": 0.02685882587948854,\n \"acc_norm\": 0.662379421221865,\n\
\ \"acc_norm_stderr\": 0.02685882587948854\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.026774929899722324,\n\
\ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.026774929899722324\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42698826597131684,\n\
\ \"acc_stderr\": 0.012633353557534427,\n \"acc_norm\": 0.42698826597131684,\n\
\ \"acc_norm_stderr\": 0.012633353557534427\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.030332578094555026,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.030332578094555026\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5866013071895425,\n \"acc_stderr\": 0.019922115682786685,\n \
\ \"acc_norm\": 0.5866013071895425,\n \"acc_norm_stderr\": 0.019922115682786685\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.03002105623844031,\n\
\ \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.03002105623844031\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.029475250236017183,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.029475250236017183\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046042,\n \"mc2\": 0.5173591737183307,\n\
\ \"mc2_stderr\": 0.016143767707448048\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7363851617995264,\n \"acc_stderr\": 0.012382849299658468\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.46929492039423804,\n \
\ \"acc_stderr\": 0.013746490739560042\n }\n}\n```"
repo_url: https://huggingface.co/gmonsoon/Delta-4B-Base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|arc:challenge|25_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|gsm8k|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hellaswag|10_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T14-03-55.643123.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T14-03-55.643123.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- '**/details_harness|winogrande|5_2024-03-07T14-03-55.643123.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-07T14-03-55.643123.parquet'
- config_name: results
data_files:
- split: 2024_03_07T14_03_55.643123
path:
- results_2024-03-07T14-03-55.643123.parquet
- split: latest
path:
- results_2024-03-07T14-03-55.643123.parquet
---
# Dataset Card for Evaluation run of gmonsoon/Delta-4B-Base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gmonsoon/Delta-4B-Base](https://huggingface.co/gmonsoon/Delta-4B-Base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gmonsoon__Delta-4B-Base",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-07T14:03:55.643123](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__Delta-4B-Base/blob/main/results_2024-03-07T14-03-55.643123.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5905528687838363,
"acc_stderr": 0.03352325174942067,
"acc_norm": 0.5934267748828804,
"acc_norm_stderr": 0.034203989694081526,
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046042,
"mc2": 0.5173591737183307,
"mc2_stderr": 0.016143767707448048
},
"harness|arc:challenge|25": {
"acc": 0.5674061433447098,
"acc_stderr": 0.014478005694182526,
"acc_norm": 0.5861774744027304,
"acc_norm_stderr": 0.014392730009221004
},
"harness|hellaswag|10": {
"acc": 0.5875323640709023,
"acc_stderr": 0.0049127238489447955,
"acc_norm": 0.7628958374825732,
"acc_norm_stderr": 0.004244374809273614
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.02977308271331987,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.02977308271331987
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663434,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663434
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4470899470899471,
"acc_stderr": 0.025606723995777025,
"acc_norm": 0.4470899470899471,
"acc_norm_stderr": 0.025606723995777025
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7096774193548387,
"acc_stderr": 0.025822106119415895,
"acc_norm": 0.7096774193548387,
"acc_norm_stderr": 0.025822106119415895
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.024468615241478926,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.024468615241478926
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0287420409039485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0287420409039485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.01726674208763082,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.01726674208763082
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.03409386946992699,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.03409386946992699
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.03058759135160425,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.03058759135160425
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.02845882099146029,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.02845882099146029
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.026453508054040332,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.026453508054040332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6909323116219668,
"acc_stderr": 0.016524988919702208,
"acc_norm": 0.6909323116219668,
"acc_norm_stderr": 0.016524988919702208
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22905027932960895,
"acc_stderr": 0.014054314935614562,
"acc_norm": 0.22905027932960895,
"acc_norm_stderr": 0.014054314935614562
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.02773283435336394,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.02773283435336394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.02685882587948854,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.02685882587948854
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6358024691358025,
"acc_stderr": 0.026774929899722324,
"acc_norm": 0.6358024691358025,
"acc_norm_stderr": 0.026774929899722324
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42698826597131684,
"acc_stderr": 0.012633353557534427,
"acc_norm": 0.42698826597131684,
"acc_norm_stderr": 0.012633353557534427
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.030332578094555026,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.030332578094555026
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5866013071895425,
"acc_stderr": 0.019922115682786685,
"acc_norm": 0.5866013071895425,
"acc_norm_stderr": 0.019922115682786685
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.03002105623844031,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.03002105623844031
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017183,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017183
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046042,
"mc2": 0.5173591737183307,
"mc2_stderr": 0.016143767707448048
},
"harness|winogrande|5": {
"acc": 0.7363851617995264,
"acc_stderr": 0.012382849299658468
},
"harness|gsm8k|5": {
"acc": 0.46929492039423804,
"acc_stderr": 0.013746490739560042
}
}
```
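The aggregated results above are plain nested JSON keyed by task name. As an illustration (using a hand-copied subset of the values shown above), per-task accuracies can be pulled out like this:

```python
import json

# Subset of the aggregated results shown above, copied by hand for illustration.
results_json = """
{
  "harness|winogrande|5": {"acc": 0.7363851617995264, "acc_stderr": 0.012382849299658468},
  "harness|gsm8k|5": {"acc": 0.46929492039423804, "acc_stderr": 0.013746490739560042}
}
"""

results = json.loads(results_json)

# Collect the accuracy of every task that reports one.
accuracies = {task: metrics["acc"] for task, metrics in results.items() if "acc" in metrics}
print(accuracies["harness|gsm8k|5"])  # 0.46929492039423804
```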
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AlekseyKorshuk/davinci-pairwise-medium | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 2530035200
num_examples: 64759
- name: test
num_bytes: 36178476
num_examples: 7195
download_size: 848422865
dataset_size: 2566213676
---
# Dataset Card for "davinci-pairwise-medium"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sem_eval_2014_task_1 | ---
annotations_creators:
- crowdsourced
language_creators:
- expert-generated
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- extended|other-ImageFlickr and SemEval-2012 STS MSR-Video Descriptions
task_categories:
- text-classification
task_ids:
- text-scoring
- natural-language-inference
- semantic-similarity-scoring
pretty_name: SemEval 2014 - Task 1
dataset_info:
features:
- name: sentence_pair_id
dtype: int64
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: relatedness_score
dtype: float32
- name: entailment_judgment
dtype:
class_label:
names:
'0': NEUTRAL
'1': ENTAILMENT
'2': CONTRADICTION
splits:
- name: train
num_bytes: 540296
num_examples: 4500
- name: test
num_bytes: 592320
num_examples: 4927
- name: validation
num_bytes: 60981
num_examples: 500
download_size: 197230
dataset_size: 1193597
---
# Dataset Card for SemEval 2014 - Task 1
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [SemEval-2014 Task 1](https://alt.qcri.org/semeval2014/task1/)
- **Repository:**
- **Paper:** [Aclweb](https://www.aclweb.org/anthology/S14-2001/)
- **Leaderboard:**
- **Point of Contact:**
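The `entailment_judgment` feature declared in the YAML header above is a three-way class label. A minimal sketch of decoding the integer labels to their names (the helper name is our own, not part of the dataset):

```python
# Label mapping taken from the class_label declaration in the YAML header above.
ENTAILMENT_LABELS = {0: "NEUTRAL", 1: "ENTAILMENT", 2: "CONTRADICTION"}

def decode_label(label_id: int) -> str:
    """Map an integer entailment_judgment value to its class name."""
    return ENTAILMENT_LABELS[label_id]

print(decode_label(1))  # ENTAILMENT
```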
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@ashmeet13](https://github.com/ashmeet13) for adding this dataset. |
lizziepika/starwarsquotes | ---
license: apache-2.0
---
|
ggul-tiger/negobot_userdata | ---
dataset_info:
features:
- name: title
dtype: string
- name: description
dtype: string
- name: price
dtype: int64
- name: result
dtype: string
- name: events
list:
- name: message
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 15871
num_examples: 15
download_size: 11138
dataset_size: 15871
---
# Dataset Card for "negobot-userdata"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arka0821/multi_document_summarization | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- summarization
task_ids:
- summarization-other-paper-abstract-generation
paperswithcode_id: multi-document
pretty_name: Multi-Document
---
# Dataset Card for Multi-Document
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** [Multi-Document repository](https://github.com/arka0821/multi_document_summarization)
- **Paper:** [Multi-Document: A Large-scale Dataset for Extreme Multi-document Summarization of Scientific Articles](https://arxiv.org/abs/2010.14235)
### Dataset Summary
Multi-Document is a large-scale multi-document summarization dataset created from scientific articles. It introduces a challenging multi-document summarization task: writing the related-work section of a paper based on its abstract and the articles it references.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The text in the dataset is in English
## Dataset Structure
### Data Instances
{"id": "n3ByHGrxH3bvfrvF", "docs": [{"id": "1394519630182457344", "text": "Clover Bio's COVID-19 vaccine candidate shows immune response against SARS-CoV-2 variants in mouse model https://t.co/wNWa9GQux5"}, {"id": "1398154482463170561", "text": "The purpose of the Vaccine is not to stop you from catching COVID 19. The vaccine introduces the immune system to an inactivated form of the SARS-CoV-2 coronavirus or a small part of it. This then equips the body with the ability to fight the virus better in case you get it. https://t.co/Cz9OU6Zi7P"}, {"id": "1354844652520792071", "text": "The Moderna mRNA COVID-19 vaccine appears to be effective against the novel, rapidly spreading variants of SARS-CoV-2.\nResearchers analysed blood samples from vaccinated people and monkeys- Both contained neutralising antibodies against the virus. \nPT1/2\n#COVID19vaccines #biotech https://t.co/ET1maJznot"}, {"id": "1340189698107518976", "text": "@KhandaniM Pfizer vaccine introduces viral surface protein which is constant accross SARS COV 2 variants into the body. Body builds antibodies against this protein, not any virus. These antibodies instructs macrophages & T-Cells to attack & destroy any COVID-19 v variant at infection point"}, {"id": "1374368989581778945", "text": "@DelthiaRicks \" Pfizer and BioNTech\u2019s COVID-19 vaccine is an mRNA vaccine, which does not use the live virus but rather a small portion of the viral sequence of the SARS-CoV-2 virus to instruct the body to produce the spike protein displayed on the surface of the virus.\""}, {"id": "1353354819315126273", "text": "Pfizer and BioNTech Publish Results of Study Showing COVID-19 Vaccine Elicits Antibodies that Neutralize Pseudovirus Bearing the SARS-CoV-2 U.K. 
Strain Spike Protein in Cell Culture | Pfizer https://t.co/YXcSnjLt8C"}, {"id": "1400821856362401792", "text": "Pfizer-BioNTech's covid-19 vaccine elicits lower levels of antibodies against the SARS-CoV-2\u00a0Delta variant\u00a0(B.1.617.2), first discovered in India, in comparison to other variants, said a research published in\u00a0Lancet\u00a0journal.\n https://t.co/IaCMX81X3b"}, {"id": "1367252963190665219", "text": "New research from UNC-Chapel Hill suggests that those who have previously experienced a SARS-CoV-2 infection develop a significant antibody response to the first dose of mRNA-based COVID-19 vaccine.\nhttps://t.co/B4vR1KUQ0w"}, {"id": "1375949502461394946", "text": "Mechanism of a COVID-19 nanoparticle vaccine candidate that elicits a broadly neutralizing antibody response to SARS-CoV-2 variants https://t.co/nc1L0uvtlI #bioRxiv"}, {"id": "1395428608349548550", "text": "JCI - Efficient maternal to neonatal transfer of antibodies against SARS-CoV-2 and BNT162b2 mRNA COVID-19 vaccine https://t.co/vIBcpPaKFZ"}], "summary": "The COVID-19 vaccine appears to be effective against the novel, rapidly spreading variants of SARS-CoV-2. Pfizer-BioNTech's COVID-19 vaccine use small portion of the viral sequence of the SARS-CoV-2 virus to equip the body with the ability to fight the virus better in case you get it."}
### Data Fields
{'id': text of paper abstract \
'docs': document id \
[
'id': id of text \
'text': text data \
]
'summary': summary text
}
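Tying the fields together, a minimal sketch iterating over a record shaped like the instance shown above (texts shortened for brevity):

```python
# Minimal record shaped like the data instance above (texts shortened).
example = {
    "id": "n3ByHGrxH3bvfrvF",
    "docs": [
        {"id": "1394519630182457344", "text": "Clover Bio's COVID-19 vaccine candidate ..."},
        {"id": "1398154482463170561", "text": "The purpose of the Vaccine is not ..."},
    ],
    "summary": "The COVID-19 vaccine appears to be effective ...",
}

# Each example pairs a list of source documents with one reference summary.
source_texts = [doc["text"] for doc in example["docs"]]
assert len(source_texts) == len(example["docs"])
```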
### Data Splits
The data is split into training, validation, and test sets.
| train | validation | test |
|------:|-----------:|-----:|
| 50 | 10 | 5 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@article{lu2020multi,
title={Multi-Document: A Large-scale Dataset for Extreme Multi-document Summarization of Scientific Articles},
author={Arka Das, India},
journal={arXiv preprint arXiv:2010.14235},
year={2022}
}
```
### Contributions
Thanks to [@arka0821](https://github.com/arka0821/multi_document_summarization) for adding this dataset.
|
rubend18/CIE10 | ---
task_categories:
- text-classification
- token-classification
language:
- es
tags:
- salud
- health
- diagnóstico
- ICD10Codes
- MedicalCoding
- HealthcareClassification
- DiseaseClassification
- ICD10Diagnosis
- MedicalTerminology
- HealthData
- ClinicalCoding
- HealthcareStandards
- MedicalClassification
- CódigosCIE10
- CodificaciónMédica
- ClasificaciónSanitaria
- ClasificaciónEnfermedades
- DiagnósticoCIE10
- TerminologíaMédica
- DatosSalud
- CodificaciónClínica
- EstándaresSanitarios
- ClasificaciónMédica
pretty_name: Diagnósticos Médicos CIE10
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Autor:** Rubén Darío Jaramillo
- **Email:** rubend18@hotmail.com
- **WhatsApp:** +593 93 979 6676
### Dataset Summary
CIE10 is the 10th revision of the International Statistical Classification of Diseases and Related Health Problems (ICD), a medical classification list by the World Health Organization (WHO). It contains codes for diseases, signs and symptoms, abnormal findings, complaints, social circumstances, and external causes of injury or diseases. Work on ICD-10 began in 1983, was endorsed by the Forty-third World Health Assembly in 1990, and was first used by member states in 1994. It was replaced by ICD-11 on January 1, 2022.
While WHO manages and publishes the base version of the ICD, several member states have modified it to better suit their needs. In the base classification, the code set allows for more than 14,000 different codes and permits the tracking of many new diagnoses compared to the preceding ICD-9. Through the use of optional sub-classifications, ICD-10 allows for specificity regarding the cause, manifestation, location, severity, and type of injury or disease. The adapted versions may differ in a number of ways, and some national editions have expanded the code set even further; with some going so far as to add procedure codes. ICD-10-CM, for example, has over 70,000 codes.
The WHO provides detailed information regarding the ICD via its website – including an ICD-10 online browser and ICD training materials. The online training includes a support forum, a self-learning tool and user guide.
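Base ICD-10 codes follow a compact shape: one letter, two digits, and an optional subdivision after a dot. A rough validation sketch (the pattern is an approximation of the base classification; national modifications such as ICD-10-CM extend it further):

```python
import re

# Rough shape of a base ICD-10 code: one letter, two digits, and an optional
# dot-separated subdivision. National variants (e.g. ICD-10-CM) extend this,
# so treat the pattern as an approximation, not an authoritative validator.
ICD10_PATTERN = re.compile(r"^[A-Z][0-9]{2}(\.[0-9A-Z]{1,4})?$")

def looks_like_icd10(code: str) -> bool:
    """Return True if the string matches the rough base ICD-10 code shape."""
    return bool(ICD10_PATTERN.match(code))

print(looks_like_icd10("J45.0"))  # True
```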
https://en.wikipedia.org/wiki/ICD-10 |
open-llm-leaderboard/details_MAISAAI__gemma-2b-coder | ---
pretty_name: Evaluation run of MAISAAI/gemma-2b-coder
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MAISAAI/gemma-2b-coder](https://huggingface.co/MAISAAI/gemma-2b-coder) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MAISAAI__gemma-2b-coder\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-24T06:53:52.284429](https://huggingface.co/datasets/open-llm-leaderboard/details_MAISAAI__gemma-2b-coder/blob/main/results_2024-02-24T06-53-52.284429.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.37611773844228785,\n\
\ \"acc_stderr\": 0.03383242673281249,\n \"acc_norm\": 0.3780753668746461,\n\
\ \"acc_norm_stderr\": 0.034575857053117554,\n \"mc1\": 0.2215422276621787,\n\
\ \"mc1_stderr\": 0.014537867601301137,\n \"mc2\": 0.3354424646424683,\n\
\ \"mc2_stderr\": 0.013418718160544026\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4761092150170648,\n \"acc_stderr\": 0.014594701798071654,\n\
\ \"acc_norm\": 0.48976109215017066,\n \"acc_norm_stderr\": 0.014608326906285012\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5354511053574985,\n\
\ \"acc_stderr\": 0.004977223485342017,\n \"acc_norm\": 0.714299940250946,\n\
\ \"acc_norm_stderr\": 0.004508239594503833\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.42,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3849056603773585,\n \"acc_stderr\": 0.029946498567699948,\n\
\ \"acc_norm\": 0.3849056603773585,\n \"acc_norm_stderr\": 0.029946498567699948\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3958333333333333,\n\
\ \"acc_stderr\": 0.04089465449325583,\n \"acc_norm\": 0.3958333333333333,\n\
\ \"acc_norm_stderr\": 0.04089465449325583\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.35260115606936415,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.35260115606936415,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03708284662416544,\n\
\ \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03708284662416544\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39148936170212767,\n \"acc_stderr\": 0.03190701242326812,\n\
\ \"acc_norm\": 0.39148936170212767,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633356,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633356\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04006168083848876,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04006168083848876\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.38064516129032255,\n\
\ \"acc_stderr\": 0.027621717832907036,\n \"acc_norm\": 0.38064516129032255,\n\
\ \"acc_norm_stderr\": 0.027621717832907036\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.032690808719701876,\n\
\ \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.032690808719701876\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.41818181818181815,\n \"acc_stderr\": 0.03851716319398394,\n\
\ \"acc_norm\": 0.41818181818181815,\n \"acc_norm_stderr\": 0.03851716319398394\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.41919191919191917,\n \"acc_stderr\": 0.035155207286704175,\n \"\
acc_norm\": 0.41919191919191917,\n \"acc_norm_stderr\": 0.035155207286704175\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.41450777202072536,\n \"acc_stderr\": 0.03555300319557673,\n\
\ \"acc_norm\": 0.41450777202072536,\n \"acc_norm_stderr\": 0.03555300319557673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3282051282051282,\n \"acc_stderr\": 0.023807633198657262,\n\
\ \"acc_norm\": 0.3282051282051282,\n \"acc_norm_stderr\": 0.023807633198657262\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31932773109243695,\n \"acc_stderr\": 0.030283995525884396,\n\
\ \"acc_norm\": 0.31932773109243695,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008937,\n \"\
acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008937\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.4917431192660551,\n \"acc_stderr\": 0.021434399918214334,\n \"\
acc_norm\": 0.4917431192660551,\n \"acc_norm_stderr\": 0.021434399918214334\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2361111111111111,\n \"acc_stderr\": 0.02896370257079102,\n \"\
acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.02896370257079102\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.39215686274509803,\n \"acc_stderr\": 0.034267123492472726,\n \"\
acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.034267123492472726\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4050632911392405,\n \"acc_stderr\": 0.03195514741370673,\n \
\ \"acc_norm\": 0.4050632911392405,\n \"acc_norm_stderr\": 0.03195514741370673\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4439461883408072,\n\
\ \"acc_stderr\": 0.03334625674242728,\n \"acc_norm\": 0.4439461883408072,\n\
\ \"acc_norm_stderr\": 0.03334625674242728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.42748091603053434,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.42748091603053434,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5537190082644629,\n \"acc_stderr\": 0.04537935177947879,\n \"\
acc_norm\": 0.5537190082644629,\n \"acc_norm_stderr\": 0.04537935177947879\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.37037037037037035,\n\
\ \"acc_stderr\": 0.04668408033024932,\n \"acc_norm\": 0.37037037037037035,\n\
\ \"acc_norm_stderr\": 0.04668408033024932\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.31901840490797545,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.31901840490797545,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.44660194174757284,\n \"acc_stderr\": 0.04922424153458933,\n\
\ \"acc_norm\": 0.44660194174757284,\n \"acc_norm_stderr\": 0.04922424153458933\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.03255326307272487,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.03255326307272487\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.508301404853129,\n\
\ \"acc_stderr\": 0.017877498991072,\n \"acc_norm\": 0.508301404853129,\n\
\ \"acc_norm_stderr\": 0.017877498991072\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3988439306358382,\n \"acc_stderr\": 0.026362437574546545,\n\
\ \"acc_norm\": 0.3988439306358382,\n \"acc_norm_stderr\": 0.026362437574546545\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25251396648044694,\n\
\ \"acc_stderr\": 0.014530330201468634,\n \"acc_norm\": 0.25251396648044694,\n\
\ \"acc_norm_stderr\": 0.014530330201468634\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.434640522875817,\n \"acc_stderr\": 0.028384256704883034,\n\
\ \"acc_norm\": 0.434640522875817,\n \"acc_norm_stderr\": 0.028384256704883034\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.41479099678456594,\n\
\ \"acc_stderr\": 0.027982680459759563,\n \"acc_norm\": 0.41479099678456594,\n\
\ \"acc_norm_stderr\": 0.027982680459759563\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.44753086419753085,\n \"acc_stderr\": 0.027667138569422697,\n\
\ \"acc_norm\": 0.44753086419753085,\n \"acc_norm_stderr\": 0.027667138569422697\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590634,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590634\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.32659713168187743,\n\
\ \"acc_stderr\": 0.011977676704715997,\n \"acc_norm\": 0.32659713168187743,\n\
\ \"acc_norm_stderr\": 0.011977676704715997\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.025767252010855956,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.025767252010855956\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.36764705882352944,\n \"acc_stderr\": 0.019506291693954847,\n \
\ \"acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.019506291693954847\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.38181818181818183,\n\
\ \"acc_stderr\": 0.04653429807913508,\n \"acc_norm\": 0.38181818181818183,\n\
\ \"acc_norm_stderr\": 0.04653429807913508\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.37142857142857144,\n \"acc_stderr\": 0.03093285879278986,\n\
\ \"acc_norm\": 0.37142857142857144,\n \"acc_norm_stderr\": 0.03093285879278986\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4427860696517413,\n\
\ \"acc_stderr\": 0.03512310964123937,\n \"acc_norm\": 0.4427860696517413,\n\
\ \"acc_norm_stderr\": 0.03512310964123937\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\
\ \"acc_stderr\": 0.03799857454479636,\n \"acc_norm\": 0.39156626506024095,\n\
\ \"acc_norm_stderr\": 0.03799857454479636\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.03829509868994727,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.03829509868994727\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2215422276621787,\n\
\ \"mc1_stderr\": 0.014537867601301137,\n \"mc2\": 0.3354424646424683,\n\
\ \"mc2_stderr\": 0.013418718160544026\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6685082872928176,\n \"acc_stderr\": 0.013230397198964652\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1607278241091736,\n \
\ \"acc_stderr\": 0.010116708586037183\n }\n}\n```"
repo_url: https://huggingface.co/MAISAAI/gemma-2b-coder
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|arc:challenge|25_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|gsm8k|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hellaswag|10_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-24T06-53-52.284429.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-24T06-53-52.284429.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- '**/details_harness|winogrande|5_2024-02-24T06-53-52.284429.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-24T06-53-52.284429.parquet'
- config_name: results
data_files:
- split: 2024_02_24T06_53_52.284429
path:
- results_2024-02-24T06-53-52.284429.parquet
- split: latest
path:
- results_2024-02-24T06-53-52.284429.parquet
---
# Dataset Card for Evaluation run of MAISAAI/gemma-2b-coder
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MAISAAI/gemma-2b-coder](https://huggingface.co/MAISAAI/gemma-2b-coder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MAISAAI__gemma-2b-coder",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-24T06:53:52.284429](https://huggingface.co/datasets/open-llm-leaderboard/details_MAISAAI__gemma-2b-coder/blob/main/results_2024-02-24T06-53-52.284429.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.37611773844228785,
"acc_stderr": 0.03383242673281249,
"acc_norm": 0.3780753668746461,
"acc_norm_stderr": 0.034575857053117554,
"mc1": 0.2215422276621787,
"mc1_stderr": 0.014537867601301137,
"mc2": 0.3354424646424683,
"mc2_stderr": 0.013418718160544026
},
"harness|arc:challenge|25": {
"acc": 0.4761092150170648,
"acc_stderr": 0.014594701798071654,
"acc_norm": 0.48976109215017066,
"acc_norm_stderr": 0.014608326906285012
},
"harness|hellaswag|10": {
"acc": 0.5354511053574985,
"acc_stderr": 0.004977223485342017,
"acc_norm": 0.714299940250946,
"acc_norm_stderr": 0.004508239594503833
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3849056603773585,
"acc_stderr": 0.029946498567699948,
"acc_norm": 0.3849056603773585,
"acc_norm_stderr": 0.029946498567699948
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3958333333333333,
"acc_stderr": 0.04089465449325583,
"acc_norm": 0.3958333333333333,
"acc_norm_stderr": 0.04089465449325583
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.35260115606936415,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.35260115606936415,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03708284662416544,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03708284662416544
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39148936170212767,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.39148936170212767,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.043391383225798615,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.043391383225798615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633356,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633356
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04006168083848876,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04006168083848876
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.38064516129032255,
"acc_stderr": 0.027621717832907036,
"acc_norm": 0.38064516129032255,
"acc_norm_stderr": 0.027621717832907036
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.032690808719701876,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.032690808719701876
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.03851716319398394,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.03851716319398394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.41919191919191917,
"acc_stderr": 0.035155207286704175,
"acc_norm": 0.41919191919191917,
"acc_norm_stderr": 0.035155207286704175
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.41450777202072536,
"acc_stderr": 0.03555300319557673,
"acc_norm": 0.41450777202072536,
"acc_norm_stderr": 0.03555300319557673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3282051282051282,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.3282051282051282,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31932773109243695,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.31932773109243695,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008937,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008937
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4917431192660551,
"acc_stderr": 0.021434399918214334,
"acc_norm": 0.4917431192660551,
"acc_norm_stderr": 0.021434399918214334
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.02896370257079102,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.02896370257079102
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.034267123492472726,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.034267123492472726
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4050632911392405,
"acc_stderr": 0.03195514741370673,
"acc_norm": 0.4050632911392405,
"acc_norm_stderr": 0.03195514741370673
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4439461883408072,
"acc_stderr": 0.03334625674242728,
"acc_norm": 0.4439461883408072,
"acc_norm_stderr": 0.03334625674242728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.42748091603053434,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.42748091603053434,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5537190082644629,
"acc_stderr": 0.04537935177947879,
"acc_norm": 0.5537190082644629,
"acc_norm_stderr": 0.04537935177947879
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.04668408033024932,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.04668408033024932
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.31901840490797545,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.31901840490797545,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.44660194174757284,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.44660194174757284,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03255326307272487,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03255326307272487
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.508301404853129,
"acc_stderr": 0.017877498991072,
"acc_norm": 0.508301404853129,
"acc_norm_stderr": 0.017877498991072
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.026362437574546545,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.026362437574546545
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25251396648044694,
"acc_stderr": 0.014530330201468634,
"acc_norm": 0.25251396648044694,
"acc_norm_stderr": 0.014530330201468634
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.434640522875817,
"acc_stderr": 0.028384256704883034,
"acc_norm": 0.434640522875817,
"acc_norm_stderr": 0.028384256704883034
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.41479099678456594,
"acc_stderr": 0.027982680459759563,
"acc_norm": 0.41479099678456594,
"acc_norm_stderr": 0.027982680459759563
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.44753086419753085,
"acc_stderr": 0.027667138569422697,
"acc_norm": 0.44753086419753085,
"acc_norm_stderr": 0.027667138569422697
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590634,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590634
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.32659713168187743,
"acc_stderr": 0.011977676704715997,
"acc_norm": 0.32659713168187743,
"acc_norm_stderr": 0.011977676704715997
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.025767252010855956,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.025767252010855956
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.019506291693954847,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.019506291693954847
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.38181818181818183,
"acc_stderr": 0.04653429807913508,
"acc_norm": 0.38181818181818183,
"acc_norm_stderr": 0.04653429807913508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.37142857142857144,
"acc_stderr": 0.03093285879278986,
"acc_norm": 0.37142857142857144,
"acc_norm_stderr": 0.03093285879278986
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4427860696517413,
"acc_stderr": 0.03512310964123937,
"acc_norm": 0.4427860696517413,
"acc_norm_stderr": 0.03512310964123937
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.03799857454479636,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.03799857454479636
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.03829509868994727,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.03829509868994727
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2215422276621787,
"mc1_stderr": 0.014537867601301137,
"mc2": 0.3354424646424683,
"mc2_stderr": 0.013418718160544026
},
"harness|winogrande|5": {
"acc": 0.6685082872928176,
"acc_stderr": 0.013230397198964652
},
"harness|gsm8k|5": {
"acc": 0.1607278241091736,
"acc_stderr": 0.010116708586037183
}
}
```
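The per-task metrics above are plain JSON, so they can be summarized locally once loaded. As a minimal sketch (the four tasks and the unweighted mean below are illustrative only, not the leaderboard's actual aggregation scheme):

```python
# Average the "acc" metric across four of the per-task results shown above.
per_task = {
    "harness|arc:challenge|25": {"acc": 0.4761092150170648},
    "harness|hellaswag|10": {"acc": 0.5354511053574985},
    "harness|winogrande|5": {"acc": 0.6685082872928176},
    "harness|gsm8k|5": {"acc": 0.1607278241091736},
}
mean_acc = sum(v["acc"] for v in per_task.values()) / len(per_task)
print(round(mean_acc, 4))  # → 0.4602
```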
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
HamdanXI/arb-eng-parallel-10k | ---
dataset_info:
features:
- name: arabic
dtype: string
- name: english
dtype: string
splits:
- name: train
num_bytes: 4293258.423270529
num_examples: 10000
download_size: 2378038
dataset_size: 4293258.423270529
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
drublackberry/hbr-coaching-real-leaders | ---
license: mit
---
Transcripts of HBR's Coaching Real Leaders podcast; the episodes can be found [here](https://hbr.org/2020/12/podcast-coaching-real-leaders).
|
subset-data/finetune-data-e4da7017fcce | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 439213.3333333333
num_examples: 56
- name: test
num_bytes: 31372.380952380954
num_examples: 4
- name: valid
num_bytes: 23529.285714285714
num_examples: 3
download_size: 157086
dataset_size: 494115.0
---
# Dataset Card for "finetune-data-e4da7017fcce"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
severo/speech-rj-hi |
---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 3672926800.4989805
num_examples: 422603
- name: test
num_bytes: 36510981.394019544
num_examples: 4269
download_size: 2808288472
dataset_size: 3709437781.893
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
license: mit
task_categories:
- text-to-speech
- automatic-speech-recognition
language:
- hi
pretty_name: Rajasthani Speech Dataset
size_categories:
- 100K<n<1M
---
# Rajasthani Hindi Speech Dataset
<!-- Provide a quick summary of the dataset. -->
This dataset consists of audio recordings of participants reading out stories in Rajasthani Hindi, one sentence at a time. We had 98 participants from Soda, Rajasthan. Each participant read 30 stories. In total, we have 426873 recordings in this dataset. We had roughly 58 male participants and 40 female participants.
> **Point to Note:**
> While random sampling suggests that most participants made a good-faith effort to read out the sentences accurately, we have not performed any quality analysis on the data. There could be errors in some of the recordings.
<!-- Provide a longer summary of what this dataset is. -->
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Link:** [Download](https://www.microsoft.com/en-gb/download/details.aspx?id=105385)
- **Curated By:** [Kalika Bali](https://www.microsoft.com/en-us/research/people/kalikab/downloads/)
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
Contains two columns, `audio` and `sentence`, holding the audio file and its transcript respectively.
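Each row pairs an audio clip with its transcript. As an illustrative, stdlib-only sketch of working with the audio column's underlying bytes (the clip below is a synthetic one-second tone standing in for a real recording, not data from this dataset):

```python
import io
import math
import struct
import wave

def wav_duration_seconds(wav_bytes: bytes) -> float:
    """Return the duration of a WAV payload in seconds."""
    with wave.open(io.BytesIO(wav_bytes), "rb") as wf:
        return wf.getnframes() / wf.getframerate()

# Build a synthetic 1-second, 16 kHz mono clip standing in for one "audio" cell.
buf = io.BytesIO()
with wave.open(buf, "wb") as wf:
    wf.setnchannels(1)
    wf.setsampwidth(2)  # 16-bit samples
    wf.setframerate(16000)
    samples = (
        int(10000 * math.sin(2 * math.pi * 440 * t / 16000)) for t in range(16000)
    )
    wf.writeframes(b"".join(struct.pack("<h", s) for s in samples))

print(wav_duration_seconds(buf.getvalue()))  # → 1.0
```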
|
tuperte69/sdft-test-03 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 303231880.0
num_examples: 172
download_size: 303231499
dataset_size: 303231880.0
---
# Dataset Card for "sdft-test-03"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/13000000_Groups_Man_Machine_Conversation_Interactive_Text_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
Human-machine dialogue interaction text data, 13 million groups in total. The data consists of interaction text between a user and a robot. Each line represents one group of interactions, with turns separated by '|'. This dataset can be used for natural language understanding, knowledge base construction, etc.
For more details, please refer to the link: https://www.nexdata.ai/dataset/249?source=Huggingface
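Given the line format described above (one group per line, turns separated by '|'), each group can be split into individual turns. A minimal stdlib sketch, using an invented sample line for illustration:

```python
def parse_interaction(line: str) -> list:
    """Split one '|'-separated group into individual turns, trimming whitespace."""
    return [turn.strip() for turn in line.split("|") if turn.strip()]

# Hypothetical sample line in the described format (not from the dataset itself).
sample = "你好|你好,有什么可以帮您?|今天天气怎么样|今天晴,适合出行"
turns = parse_interaction(sample)
print(len(turns))  # → 4
```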
# Specifications
## Data content
Human-machine dialogue interactive text data
## Data size
13 million sets
## Collecting period
The year 2017
## Storage format
txt
## Language
Chinese
# Licensing Information
Commercial License
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_20 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1426131480.0
num_examples: 277890
download_size: 1457669325
dataset_size: 1426131480.0
---
# Dataset Card for "chunk_20"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pphuc25/vanmauvip_com | ---
dataset_info:
features:
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 71040692
num_examples: 13390
download_size: 35161324
dataset_size: 71040692
---
# Dataset Card for "vanmauvip_com"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/n102_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of n102/N102/N102/N102 (Nikke: Goddess of Victory)
This is the dataset of n102/N102/N102/N102 (Nikke: Goddess of Victory), containing 39 images and their tags.
The core tags of this character are `bangs, animal_ears, hair_ornament, blue_eyes, white_hair, hair_between_eyes, long_hair, twintails, animal_ear_fluff, hair_bun, cat_ears, double_bun, butterfly_hair_ornament, hairclip`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 39 | 73.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/n102_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 39 | 36.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/n102_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 103 | 86.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/n102_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 39 | 62.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/n102_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 103 | 131.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/n102_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/n102_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 39 |  |  |  |  |  | 1girl, solo, looking_at_viewer, smile, blush, open_mouth, fur_trim, long_sleeves, virtual_youtuber, butterfly, choker, white_jacket, dress |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | smile | blush | open_mouth | fur_trim | long_sleeves | virtual_youtuber | butterfly | choker | white_jacket | dress |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:--------|:-------------|:-----------|:---------------|:-------------------|:------------|:---------|:---------------|:--------|
| 0 | 39 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
autoevaluate/autoeval-eval-tweet_eval-emotion-dbaa98-66233145580 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- tweet_eval
eval_info:
task: multi_class_classification
model: 095ey11/bert-emotion
metrics: []
dataset_name: tweet_eval
dataset_config: emotion
dataset_split: train
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: 095ey11/bert-emotion
* Dataset: tweet_eval
* Config: emotion
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Ayushkm2799](https://huggingface.co/Ayushkm2799) for evaluating this model. |
lleticiasilvaa/defog_wikisql_adaptado | ---
dataset_info:
features:
- name: question
dtype: string
- name: metadata
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 875226
num_examples: 1000
download_size: 324715
dataset_size: 875226
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_BarryFutureman__NeuralLake-Variant1-7B | ---
pretty_name: Evaluation run of BarryFutureman/NeuralLake-Variant1-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BarryFutureman/NeuralLake-Variant1-7B](https://huggingface.co/BarryFutureman/NeuralLake-Variant1-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BarryFutureman__NeuralLake-Variant1-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-23T20:08:48.201286](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__NeuralLake-Variant1-7B/blob/main/results_2024-01-23T20-08-48.201286.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6527879049855548,\n\
\ \"acc_stderr\": 0.032052113329256254,\n \"acc_norm\": 0.652189910746759,\n\
\ \"acc_norm_stderr\": 0.032721608531391104,\n \"mc1\": 0.5483476132190942,\n\
\ \"mc1_stderr\": 0.017421480300277643,\n \"mc2\": 0.6837155338410112,\n\
\ \"mc2_stderr\": 0.015180251006560648\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.01336308010724448,\n\
\ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.0129550659637107\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7168890659231228,\n\
\ \"acc_stderr\": 0.004495891440519419,\n \"acc_norm\": 0.8844851623182632,\n\
\ \"acc_norm_stderr\": 0.003189889789404668\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933713,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092437,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092437\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156861,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156861\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368985,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.02344582627654554,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.02344582627654554\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4346368715083799,\n\
\ \"acc_stderr\": 0.01657899743549672,\n \"acc_norm\": 0.4346368715083799,\n\
\ \"acc_norm_stderr\": 0.01657899743549672\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.45697522816166886,\n \"acc_stderr\": 0.012722869501611419,\n\
\ \"acc_norm\": 0.45697522816166886,\n \"acc_norm_stderr\": 0.012722869501611419\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"\
acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507205,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507205\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5483476132190942,\n\
\ \"mc1_stderr\": 0.017421480300277643,\n \"mc2\": 0.6837155338410112,\n\
\ \"mc2_stderr\": 0.015180251006560648\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8445146014206788,\n \"acc_stderr\": 0.010184308214775777\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6929492039423806,\n \
\ \"acc_stderr\": 0.012705685723131709\n }\n}\n```"
repo_url: https://huggingface.co/BarryFutureman/NeuralLake-Variant1-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|arc:challenge|25_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|gsm8k|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hellaswag|10_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T20-08-48.201286.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T20-08-48.201286.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- '**/details_harness|winogrande|5_2024-01-23T20-08-48.201286.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-23T20-08-48.201286.parquet'
- config_name: results
data_files:
- split: 2024_01_23T20_08_48.201286
path:
- results_2024-01-23T20-08-48.201286.parquet
- split: latest
path:
- results_2024-01-23T20-08-48.201286.parquet
---
# Dataset Card for Evaluation run of BarryFutureman/NeuralLake-Variant1-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BarryFutureman/NeuralLake-Variant1-7B](https://huggingface.co/BarryFutureman/NeuralLake-Variant1-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BarryFutureman__NeuralLake-Variant1-7B",
"harness_winogrande_5",
split="train")
```
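The timestamped split names above follow directly from the run timestamp: characters that are not valid in split names are replaced with underscores. A small helper sketch (the mapping is inferred from the split and file names in this card, not from an official API):

```python
def run_timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp (e.g. "2024-01-23T20:08:48.201286")
    into the split name used in this dataset's configurations
    (e.g. "2024_01_23T20_08_48.201286")."""
    # Split names replace "-" and ":" with "_"; the fractional seconds keep their dot.
    return ts.replace("-", "_").replace(":", "_")
```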
## Latest results
These are the [latest results from run 2024-01-23T20:08:48.201286](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__NeuralLake-Variant1-7B/blob/main/results_2024-01-23T20-08-48.201286.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in its results file and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6527879049855548,
"acc_stderr": 0.032052113329256254,
"acc_norm": 0.652189910746759,
"acc_norm_stderr": 0.032721608531391104,
"mc1": 0.5483476132190942,
"mc1_stderr": 0.017421480300277643,
"mc2": 0.6837155338410112,
"mc2_stderr": 0.015180251006560648
},
"harness|arc:challenge|25": {
"acc": 0.7022184300341296,
"acc_stderr": 0.01336308010724448,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.0129550659637107
},
"harness|hellaswag|10": {
"acc": 0.7168890659231228,
"acc_stderr": 0.004495891440519419,
"acc_norm": 0.8844851623182632,
"acc_norm_stderr": 0.003189889789404668
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933713,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092437,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092437
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156861,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156861
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368985,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368985
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.02344582627654554,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.02344582627654554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4346368715083799,
"acc_stderr": 0.01657899743549672,
"acc_norm": 0.4346368715083799,
"acc_norm_stderr": 0.01657899743549672
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45697522816166886,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.45697522816166886,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507205,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507205
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5483476132190942,
"mc1_stderr": 0.017421480300277643,
"mc2": 0.6837155338410112,
"mc2_stderr": 0.015180251006560648
},
"harness|winogrande|5": {
"acc": 0.8445146014206788,
"acc_stderr": 0.010184308214775777
},
"harness|gsm8k|5": {
"acc": 0.6929492039423806,
"acc_stderr": 0.012705685723131709
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kaist-ai/Perception-Bench | ---
license: cc-by-4.0
task_categories:
- visual-question-answering
- text2text-generation
- image-to-text
language:
- en
size_categories:
- n<1K
---
# Dataset Card
- **Homepage: https://kaistai.github.io/prometheus-vision/**
- **Repository: https://github.com/kaistAI/prometheus-vision**
- **Paper: https://arxiv.org/abs/2401.06591**
- **Point of Contact: seongyun@kaist.ac.kr**
### Dataset summary
Perception-Bench is a benchmark for evaluating the long-form responses of a VLM (Vision Language Model) across various domains of images. It is a held-out test set of [Perception-Collection](https://huggingface.co/datasets/kaist-ai/Perception-Collection).

### Languages
English
## Dataset Structure
* `image`: The path of the image, consisting of images from the MMMU dataset and the COCO 2017 train set.
* `instruction`: The input given to the evaluator VLM. It includes the instruction & response to evaluate, the reference answer, and the score rubric.
* `orig_instruction`: The instruction to be evaluated. Note that this differs from the `instruction` field, which includes all the components above.
* `orig_reference_answer`: A reference answer to the `orig_instruction`.
* `orig_criteria`: The score criteria used to evaluate the `orig_response`.
* `orig_score1_description`: A description of when to give a score of 1 to the `orig_response`.
* `orig_score2_description`: A description of when to give a score of 2 to the `orig_response`.
* `orig_score3_description`: A description of when to give a score of 3 to the `orig_response`.
* `orig_score4_description`: A description of when to give a score of 4 to the `orig_response`.
* `orig_score5_description`: A description of when to give a score of 5 to the `orig_response`.
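The fields above combine into the single `instruction` column fed to the evaluator VLM. A minimal sketch of that assembly (the exact prompt template used by Prometheus-Vision is defined in the paper and repository; this only illustrates which `orig_*` fields feed in, and the field names and layout here are illustrative):

```python
def build_evaluator_input(orig_instruction, response, reference_answer,
                          criteria, score_descriptions):
    # Pack the orig_* fields into one evaluator prompt (illustrative layout only).
    rubric = "\n".join(
        f"Score {i}: {desc}" for i, desc in enumerate(score_descriptions, start=1)
    )
    return (
        f"Instruction to evaluate: {orig_instruction}\n"
        f"Response to evaluate: {response}\n"
        f"Reference answer: {reference_answer}\n"
        f"Score criteria: {criteria}\n"
        f"Score rubric:\n{rubric}"
    )
```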
### Data Splits
| name | test |
|-------------------|------:|
|Perception-Bench|500|
### Citation Information
If you find the following benchmark helpful, please consider citing our paper!
```bibtex
@misc{lee2024prometheusvision,
title={Prometheus-Vision: Vision-Language Model as a Judge for Fine-Grained Evaluation},
author={Seongyun Lee and Seungone Kim and Sue Hyun Park and Geewook Kim and Minjoon Seo},
year={2024},
eprint={2401.06591},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
liuyanchen1015/MULTI_VALUE_sst2_linking_relcl | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 12629
num_examples: 77
- name: test
num_bytes: 26101
num_examples: 166
- name: train
num_bytes: 240959
num_examples: 1693
download_size: 152928
dataset_size: 279689
---
# Dataset Card for "MULTI_VALUE_sst2_linking_relcl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gonta888/tsuna_mount_data | ---
license: openrail
---
|
thi8999/DATA1 | ---
dataset_info:
features:
- name: Column 1
dtype: string
- name: target
dtype: string
- name: transformed_text
dtype: string
splits:
- name: train
num_bytes: 2623
num_examples: 7
download_size: 5310
dataset_size: 2623
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/kalina_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Kalina (Girls' Frontline)
This is the dataset of Kalina (Girls' Frontline), containing 242 images and their tags.
The core tags of this character are `long_hair, breasts, blue_eyes, orange_hair, side_ponytail, hair_between_eyes, ribbon, hair_ribbon, large_breasts, eyewear_on_head, sunglasses, bangs, hair_ornament, red_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 242 | 327.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kalina_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 242 | 180.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kalina_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 618 | 409.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kalina_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 242 | 288.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kalina_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 618 | 579.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kalina_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
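In the IMG+TXT packages, each image ships with a same-named `.txt` file holding its tags. A minimal sketch for pairing them after extracting one of the zip archives (assuming the conventional flat layout of such packages):

```python
import os

def pair_images_with_tags(dataset_dir):
    """Pair each image in an extracted IMG+TXT package with its .txt tag file."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() in {".png", ".jpg", ".jpeg", ".webp"}:
            txt_path = os.path.join(dataset_dir, stem + ".txt")
            if os.path.exists(txt_path):
                with open(txt_path, encoding="utf-8") as f:
                    tags = f.read().strip()
                pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```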
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kalina_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, looking_at_viewer, solo, cleavage, simple_background, blush, pleated_skirt, white_shirt, gloves, headset, thighhighs, white_background, smile, belt, black_skirt, black_bra, bow, collarbone, mismatched_legwear, pouch, striped, holding, open_jacket |
| 1 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, cleavage, collarbone, solo, upper_body, white_shirt, simple_background, white_background, smile, glasses, red_necktie |
| 2 | 7 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, solo, black_bikini, black_choker, black_gloves, navel, official_alternate_costume, smile, blush, collarbone, denim_shorts, sun_hat, white_shirt, hairclip, heart, open_mouth, ocean, sandals, short_shorts, simple_background, standing, straw_hat, thigh_strap |
| 3 | 6 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, collarbone, looking_at_viewer, navel, solo, blush, smile, stomach, black_bikini, cowboy_shot, sidelocks, simple_background, white_background, ass_visible_through_thighs, bracelet, groin, hairclip, open_mouth, skindentation, x_hair_ornament |
| 4 | 11 |  |  |  |  |  | 1girl, griffin_&_kryuger_military_uniform, solo, white_shirt, collared_shirt, looking_at_viewer, pantyhose, black_necktie, glasses, long_sleeves, smile, blush, holding, sidelocks, sitting, jacket, off_shoulder, thigh_boots, belt, black_footwear, crossed_legs, red_coat, round_eyewear, thighhighs, closed_mouth, hairband |
| 5 | 12 |  |  |  |  |  | 1girl, blush, hetero, nipples, solo_focus, 1boy, penis, sex, bar_censor, vaginal, navel, spread_legs, thighhighs, nude, open_mouth, cum_in_pussy, heart-shaped_pupils |
| 6 | 6 |  |  |  |  |  | 1boy, 1girl, blush, hetero, solo_focus, penis, smile, breasts_squeezed_together, nipples, paizuri, sweat, white_shirt, ahoge, jacket, male_pubic_hair, mosaic_censoring, nude, open_mouth, open_shirt, red_bowtie, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | cleavage | simple_background | blush | pleated_skirt | white_shirt | gloves | headset | thighhighs | white_background | smile | belt | black_skirt | black_bra | bow | collarbone | mismatched_legwear | pouch | striped | holding | open_jacket | upper_body | glasses | red_necktie | black_bikini | black_choker | black_gloves | navel | official_alternate_costume | denim_shorts | sun_hat | hairclip | heart | open_mouth | ocean | sandals | short_shorts | standing | straw_hat | thigh_strap | bare_shoulders | stomach | cowboy_shot | sidelocks | ass_visible_through_thighs | bracelet | groin | skindentation | x_hair_ornament | griffin_&_kryuger_military_uniform | collared_shirt | pantyhose | black_necktie | long_sleeves | sitting | jacket | off_shoulder | thigh_boots | black_footwear | crossed_legs | red_coat | round_eyewear | closed_mouth | hairband | hetero | nipples | solo_focus | 1boy | penis | sex | bar_censor | vaginal | spread_legs | nude | cum_in_pussy | heart-shaped_pupils | breasts_squeezed_together | paizuri | sweat | ahoge | male_pubic_hair | mosaic_censoring | open_shirt | red_bowtie |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------|:--------------------|:--------|:----------------|:--------------|:---------|:----------|:-------------|:-------------------|:--------|:-------|:--------------|:------------|:------|:-------------|:---------------------|:--------|:----------|:----------|:--------------|:-------------|:----------|:--------------|:---------------|:---------------|:---------------|:--------|:-----------------------------|:---------------|:----------|:-----------|:--------|:-------------|:--------|:----------|:---------------|:-----------|:------------|:--------------|:-----------------|:----------|:--------------|:------------|:-----------------------------|:-----------|:--------|:----------------|:------------------|:-------------------------------------|:-----------------|:------------|:----------------|:---------------|:----------|:---------|:---------------|:--------------|:-----------------|:---------------|:-----------|:----------------|:---------------|:-----------|:---------|:----------|:-------------|:-------|:--------|:------|:-------------|:----------|:--------------|:-------|:---------------|:----------------------|:----------------------------|:----------|:--------|:--------|:------------------|:-------------------|:-------------|:-------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | | X | | | | X | X | | | | | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | X | X | | X | | | | | X | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | X | X | X | | | | | | X | X | | | | | X | | | | | | | | | X | | | X | | | | X | | X | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 11 |  |  |  |  |  | X | X | X | | | X | | X | | | X | | X | X | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 5 | 12 |  |  |  |  |  | X | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | | X | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | | | | | X | | | X | X | X | X | X | X | X | X |
|
yihaocs/Video_dataset | ---
license: apache-2.0
---
|
Rodr16020/llama_2_chat_gns3_code | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: code
dtype: string
- name: full_prompt
dtype: string
splits:
- name: train
num_bytes: 2702530
num_examples: 334
download_size: 229678
dataset_size: 2702530
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2_chat_gns3_code"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hetewfwe/colab | ---
license: other
---
|
andersonbcdefg/reward-modeling-short-tokenized | ---
dataset_info:
features:
- name: preferred_input_ids
sequence: int64
- name: preferred_attention_masks
sequence: int64
- name: dispreferred_input_ids
sequence: int64
- name: dispreferred_attention_masks
sequence: int64
splits:
- name: train
num_bytes: 8509513392
num_examples: 259563
download_size: 138519630
dataset_size: 8509513392
---
# Dataset Card for "reward-modeling-short-tokenized"
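The `preferred_*`/`dispreferred_*` token-ID pairs in this dataset are the typical input for a pairwise reward-model objective: the model scores both completions and is trained so the preferred one scores higher. A minimal sketch of the Bradley–Terry-style loss on already-computed scalar rewards (the scoring model itself is out of scope here, and the function name is illustrative, not from this dataset's tooling):

```python
import math

def pairwise_reward_loss(r_preferred: float, r_dispreferred: float) -> float:
    """-log(sigmoid(r_preferred - r_dispreferred)), computed stably.

    Small when the preferred completion outscores the dispreferred one,
    large when the ranking is inverted."""
    margin = r_preferred - r_dispreferred
    if margin >= 0:
        return math.log1p(math.exp(-margin))
    # avoid overflow of exp(-margin) for large negative margins
    return -margin + math.log1p(math.exp(margin))
```

At a margin of zero the loss is log 2; it decays toward zero as the preferred reward pulls ahead.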
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thorirhrafn/rmh_subset_medium | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 707846794
num_examples: 282160
- name: test
num_bytes: 23981399
num_examples: 10000
- name: valid
num_bytes: 3416614
num_examples: 2000
download_size: 448271172
dataset_size: 735244807
---
# Dataset Card for "rmh_subset_medium"
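The split statistics in the YAML header are internally consistent: `dataset_size` is the sum of the per-split `num_bytes`. A quick sketch verifying that for the numbers above, and deriving the average document size per split:

```python
# Split statistics copied from this card's YAML header.
splits = {
    'train': {'num_bytes': 707846794, 'num_examples': 282160},
    'test':  {'num_bytes': 23981399,  'num_examples': 10000},
    'valid': {'num_bytes': 3416614,   'num_examples': 2000},
}

# dataset_size should equal the sum of per-split byte counts.
dataset_size = sum(s['num_bytes'] for s in splits.values())

# Average (uncompressed) bytes per example in each split.
avg_doc_bytes = {name: s['num_bytes'] / s['num_examples']
                 for name, s in splits.items()}
```

Note that `download_size` (448271172) is smaller than `dataset_size` (735244807) because the hosted Parquet files are compressed.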
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ftopal/huggingface-models-raw | ---
dataset_info:
features:
- name: sha
dtype: 'null'
- name: last_modified
dtype: 'null'
- name: library_name
dtype: string
- name: text
dtype: string
- name: metadata
dtype: string
- name: pipeline_tag
dtype: string
- name: id
dtype: string
- name: tags
sequence: string
- name: created_at
dtype: string
splits:
- name: train
num_bytes: 1975056223
num_examples: 514162
download_size: 1338534125
dataset_size: 1975056223
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/jade_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of jade/ヤーデ/亚德 (Azur Lane)
This is the dataset of jade/ヤーデ/亚德 (Azur Lane), containing 46 images and their tags.
The core tags of this character are `breasts, blue_eyes, bangs, grey_hair, hair_bun, hair_ornament, large_breasts, short_hair, hair_between_eyes, hairclip, hat, double_bun, mole`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 46 | 92.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jade_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 46 | 43.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jade_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 117 | 93.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jade_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 46 | 78.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jade_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 117 | 153.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jade_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). To use it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/jade_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; recurring outfits and scenes can often be identified from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, looking_at_viewer, solo, popsicle, sailor_collar, bracelet, white_one-piece_swimsuit, blush, bare_shoulders, innertube, water, covered_navel, holding, looking_back, smile |
| 1 | 25 |  |  |  |  |  | 1girl, looking_at_viewer, solo, cleavage, smile, blush, long_sleeves, white_background, simple_background, white_gloves, black_headwear, thigh_strap, black_dress, skirt, cross, mole_under_eye |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | popsicle | sailor_collar | bracelet | white_one-piece_swimsuit | blush | bare_shoulders | innertube | water | covered_navel | holding | looking_back | smile | cleavage | long_sleeves | white_background | simple_background | white_gloves | black_headwear | thigh_strap | black_dress | skirt | cross | mole_under_eye |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------|:----------------|:-----------|:---------------------------|:--------|:-----------------|:------------|:--------|:----------------|:----------|:---------------|:--------|:-----------|:---------------|:-------------------|:--------------------|:---------------|:-----------------|:--------------|:--------------|:--------|:--------|:-----------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 25 |  |  |  |  |  | X | X | X | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
datahrvoje/twitter_dataset_1713145082 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 20797
num_examples: 46
download_size: 11757
dataset_size: 20797
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_gmonsoon__MaxiCPM-3x3B-Test | ---
pretty_name: Evaluation run of gmonsoon/MaxiCPM-3x3B-Test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gmonsoon/MaxiCPM-3x3B-Test](https://huggingface.co/gmonsoon/MaxiCPM-3x3B-Test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gmonsoon__MaxiCPM-3x3B-Test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-20T02:07:46.103681](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__MaxiCPM-3x3B-Test/blob/main/results_2024-02-20T02-07-46.103681.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5279788157337701,\n\
\ \"acc_stderr\": 0.0343045353535822,\n \"acc_norm\": 0.5308215181951149,\n\
\ \"acc_norm_stderr\": 0.035004933070071895,\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253597,\n \"mc2\": 0.41058967358987175,\n\
\ \"mc2_stderr\": 0.014565493729989814\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.42150170648464164,\n \"acc_stderr\": 0.014430197069326025,\n\
\ \"acc_norm\": 0.4598976109215017,\n \"acc_norm_stderr\": 0.01456431885692485\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5266879107747461,\n\
\ \"acc_stderr\": 0.00498266845211894,\n \"acc_norm\": 0.7173869747062338,\n\
\ \"acc_norm_stderr\": 0.004493495872000117\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.039993097127774734,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.039993097127774734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5358490566037736,\n \"acc_stderr\": 0.030693675018458003,\n\
\ \"acc_norm\": 0.5358490566037736,\n \"acc_norm_stderr\": 0.030693675018458003\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.03810871630454764,\n\
\ \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.03810871630454764\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n\
\ \"acc_stderr\": 0.04755129616062946,\n \"acc_norm\": 0.35294117647058826,\n\
\ \"acc_norm_stderr\": 0.04755129616062946\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.40425531914893614,\n\
\ \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.40425531914893614,\n\
\ \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.32456140350877194,\n \"acc_stderr\": 0.04404556157374767,\n\
\ \"acc_norm\": 0.32456140350877194,\n \"acc_norm_stderr\": 0.04404556157374767\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n \"acc_norm\"\
: 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36243386243386244,\n\
\ \"acc_stderr\": 0.024757473902752045,\n \"acc_norm\": 0.36243386243386244,\n\
\ \"acc_norm_stderr\": 0.024757473902752045\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.04134913018303316,\n\
\ \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.04134913018303316\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.6451612903225806,\n \"acc_stderr\": 0.02721888977330876,\n\
\ \"acc_norm\": 0.6451612903225806,\n \"acc_norm_stderr\": 0.02721888977330876\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959215,\n \"\
acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959215\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03825460278380025,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03825460278380025\n },\n\
\ \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6717171717171717,\n\
\ \"acc_stderr\": 0.03345678422756775,\n \"acc_norm\": 0.6717171717171717,\n\
\ \"acc_norm_stderr\": 0.03345678422756775\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.03201867122877794,\n\
\ \"acc_norm\": 0.7305699481865285,\n \"acc_norm_stderr\": 0.03201867122877794\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.45897435897435895,\n \"acc_stderr\": 0.025265525491284295,\n\
\ \"acc_norm\": 0.45897435897435895,\n \"acc_norm_stderr\": 0.025265525491284295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066485,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236152,\n \
\ \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6844036697247706,\n \"acc_stderr\": 0.01992611751386967,\n \"\
acc_norm\": 0.6844036697247706,\n \"acc_norm_stderr\": 0.01992611751386967\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.033086111132364364,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033086111132364364\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6624472573839663,\n \"acc_stderr\": 0.03078154910202622,\n \
\ \"acc_norm\": 0.6624472573839663,\n \"acc_norm_stderr\": 0.03078154910202622\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.04243869242230524,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.04243869242230524\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n\
\ \"acc_stderr\": 0.047803436269367894,\n \"acc_norm\": 0.5740740740740741,\n\
\ \"acc_norm_stderr\": 0.047803436269367894\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6196319018404908,\n \"acc_stderr\": 0.038142698932618374,\n\
\ \"acc_norm\": 0.6196319018404908,\n \"acc_norm_stderr\": 0.038142698932618374\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764377,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764377\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.04689765937278135,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.04689765937278135\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.02514093595033544,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.02514093595033544\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6743295019157088,\n\
\ \"acc_stderr\": 0.016757989458549675,\n \"acc_norm\": 0.6743295019157088,\n\
\ \"acc_norm_stderr\": 0.016757989458549675\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6098265895953757,\n \"acc_stderr\": 0.026261677607806646,\n\
\ \"acc_norm\": 0.6098265895953757,\n \"acc_norm_stderr\": 0.026261677607806646\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\
\ \"acc_stderr\": 0.014465893829859923,\n \"acc_norm\": 0.24916201117318434,\n\
\ \"acc_norm_stderr\": 0.014465893829859923\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5751633986928104,\n \"acc_stderr\": 0.02830457667314111,\n\
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.02830457667314111\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5787781350482315,\n\
\ \"acc_stderr\": 0.028043399858210624,\n \"acc_norm\": 0.5787781350482315,\n\
\ \"acc_norm_stderr\": 0.028043399858210624\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5679012345679012,\n \"acc_stderr\": 0.027563010971606672,\n\
\ \"acc_norm\": 0.5679012345679012,\n \"acc_norm_stderr\": 0.027563010971606672\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.375886524822695,\n \"acc_stderr\": 0.028893955412115882,\n \
\ \"acc_norm\": 0.375886524822695,\n \"acc_norm_stderr\": 0.028893955412115882\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39960886571056065,\n\
\ \"acc_stderr\": 0.01251018163696068,\n \"acc_norm\": 0.39960886571056065,\n\
\ \"acc_norm_stderr\": 0.01251018163696068\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.39338235294117646,\n \"acc_stderr\": 0.02967428828131118,\n\
\ \"acc_norm\": 0.39338235294117646,\n \"acc_norm_stderr\": 0.02967428828131118\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5130718954248366,\n \"acc_stderr\": 0.020220920829626923,\n \
\ \"acc_norm\": 0.5130718954248366,\n \"acc_norm_stderr\": 0.020220920829626923\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789848,\n\
\ \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789848\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.030360490154014652,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.030360490154014652\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.033773102522092056,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.033773102522092056\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253597,\n \"mc2\": 0.41058967358987175,\n\
\ \"mc2_stderr\": 0.014565493729989814\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6685082872928176,\n \"acc_stderr\": 0.013230397198964662\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4488248673237301,\n \
\ \"acc_stderr\": 0.013700157442788076\n }\n}\n```"
repo_url: https://huggingface.co/gmonsoon/MaxiCPM-3x3B-Test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|arc:challenge|25_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|gsm8k|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hellaswag|10_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T02-07-46.103681.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T02-07-46.103681.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- '**/details_harness|winogrande|5_2024-02-20T02-07-46.103681.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-20T02-07-46.103681.parquet'
- config_name: results
data_files:
- split: 2024_02_20T02_07_46.103681
path:
- results_2024-02-20T02-07-46.103681.parquet
- split: latest
path:
- results_2024-02-20T02-07-46.103681.parquet
---
# Dataset Card for Evaluation run of gmonsoon/MaxiCPM-3x3B-Test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gmonsoon/MaxiCPM-3x3B-Test](https://huggingface.co/gmonsoon/MaxiCPM-3x3B-Test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gmonsoon__MaxiCPM-3x3B-Test",
"harness_winogrande_5",
split="train")
```
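Because each run is stored as a split named after its timestamp (in the form `YYYY_MM_DDTHH_MM_SS.ffffff`), you can also resolve the newest run yourself instead of relying on the `latest` alias. A minimal sketch, assuming you already have the list of split names (the names below are illustrative):

```python
def newest_run(split_names):
    """Return the most recent timestamp-named split.

    Split names follow the pattern YYYY_MM_DDTHH_MM_SS.ffffff, which
    sorts lexicographically in chronological order, so max() suffices.
    """
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped)

splits = ["2024_02_20T02_07_46.103681", "latest"]
print(newest_run(splits))  # -> 2024_02_20T02_07_46.103681
```

This is equivalent to passing `split="latest"` when only one run exists, but becomes useful if you want to compare successive runs programmatically.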
## Latest results
These are the [latest results from run 2024-02-20T02:07:46.103681](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__MaxiCPM-3x3B-Test/blob/main/results_2024-02-20T02-07-46.103681.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5279788157337701,
"acc_stderr": 0.0343045353535822,
"acc_norm": 0.5308215181951149,
"acc_norm_stderr": 0.035004933070071895,
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253597,
"mc2": 0.41058967358987175,
"mc2_stderr": 0.014565493729989814
},
"harness|arc:challenge|25": {
"acc": 0.42150170648464164,
"acc_stderr": 0.014430197069326025,
"acc_norm": 0.4598976109215017,
"acc_norm_stderr": 0.01456431885692485
},
"harness|hellaswag|10": {
"acc": 0.5266879107747461,
"acc_stderr": 0.00498266845211894,
"acc_norm": 0.7173869747062338,
"acc_norm_stderr": 0.004493495872000117
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.039993097127774734,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.039993097127774734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5358490566037736,
"acc_stderr": 0.030693675018458003,
"acc_norm": 0.5358490566037736,
"acc_norm_stderr": 0.030693675018458003
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.024757473902752045,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.024757473902752045
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.02721888977330876,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.02721888977330876
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6,
"acc_stderr": 0.03825460278380025,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03825460278380025
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6717171717171717,
"acc_stderr": 0.03345678422756775,
"acc_norm": 0.6717171717171717,
"acc_norm_stderr": 0.03345678422756775
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7305699481865285,
"acc_stderr": 0.03201867122877794,
"acc_norm": 0.7305699481865285,
"acc_norm_stderr": 0.03201867122877794
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45897435897435895,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.45897435897435895,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066485,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6844036697247706,
"acc_stderr": 0.01992611751386967,
"acc_norm": 0.6844036697247706,
"acc_norm_stderr": 0.01992611751386967
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033086111132364364,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033086111132364364
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6624472573839663,
"acc_stderr": 0.03078154910202622,
"acc_norm": 0.6624472573839663,
"acc_norm_stderr": 0.03078154910202622
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.04243869242230524,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.04243869242230524
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.047803436269367894,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.047803436269367894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6196319018404908,
"acc_stderr": 0.038142698932618374,
"acc_norm": 0.6196319018404908,
"acc_norm_stderr": 0.038142698932618374
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764377,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764377
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.04689765937278135,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.04689765937278135
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.02514093595033544,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.02514093595033544
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6743295019157088,
"acc_stderr": 0.016757989458549675,
"acc_norm": 0.6743295019157088,
"acc_norm_stderr": 0.016757989458549675
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6098265895953757,
"acc_stderr": 0.026261677607806646,
"acc_norm": 0.6098265895953757,
"acc_norm_stderr": 0.026261677607806646
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.014465893829859923,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.014465893829859923
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.02830457667314111,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.02830457667314111
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5787781350482315,
"acc_stderr": 0.028043399858210624,
"acc_norm": 0.5787781350482315,
"acc_norm_stderr": 0.028043399858210624
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5679012345679012,
"acc_stderr": 0.027563010971606672,
"acc_norm": 0.5679012345679012,
"acc_norm_stderr": 0.027563010971606672
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.375886524822695,
"acc_stderr": 0.028893955412115882,
"acc_norm": 0.375886524822695,
"acc_norm_stderr": 0.028893955412115882
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39960886571056065,
"acc_stderr": 0.01251018163696068,
"acc_norm": 0.39960886571056065,
"acc_norm_stderr": 0.01251018163696068
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39338235294117646,
"acc_stderr": 0.02967428828131118,
"acc_norm": 0.39338235294117646,
"acc_norm_stderr": 0.02967428828131118
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5130718954248366,
"acc_stderr": 0.020220920829626923,
"acc_norm": 0.5130718954248366,
"acc_norm_stderr": 0.020220920829626923
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.030932858792789848,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.030932858792789848
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.033773102522092056,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.033773102522092056
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253597,
"mc2": 0.41058967358987175,
"mc2_stderr": 0.014565493729989814
},
"harness|winogrande|5": {
"acc": 0.6685082872928176,
"acc_stderr": 0.013230397198964662
},
"harness|gsm8k|5": {
"acc": 0.4488248673237301,
"acc_stderr": 0.013700157442788076
}
}
```
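The per-task scores above feed the leaderboard's aggregated metrics; for the MMLU (`hendrycksTest`) tasks the headline number is, to a close approximation, the plain mean of the per-subtask `acc` values. A minimal sketch, using two values copied verbatim from the JSON above (the plain-mean aggregation is an assumption, not taken from this card):

```python
# Minimal sketch: average MMLU subtask accuracy from a results dict.
# The two hendrycksTest values are copied verbatim from the JSON above;
# treating the aggregate as a plain mean is an assumption about the leaderboard.
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.8205128205128205},
    "harness|hendrycksTest-virology|5": {"acc": 0.463855421686747},
    "harness|winogrande|5": {"acc": 0.6685082872928176},
}

# Keep only the MMLU (hendrycksTest) subtasks, then take their mean.
mmlu_accs = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 4))  # → 0.6422 (mean over the two subtasks included here)
```

In practice the `results` configuration of this dataset already stores the precomputed aggregates, so one would load that config rather than recompute them.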
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_Gryphe__MythoLogic-L2-13b | ---
pretty_name: Evaluation run of Gryphe/MythoLogic-L2-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Gryphe/MythoLogic-L2-13b](https://huggingface.co/Gryphe/MythoLogic-L2-13b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gryphe__MythoLogic-L2-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T12:37:06.579153](https://huggingface.co/datasets/open-llm-leaderboard/details_Gryphe__MythoLogic-L2-13b/blob/main/results_2023-09-23T12-37-06.579153.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2177013422818792,\n\
\ \"em_stderr\": 0.004226262781727102,\n \"f1\": 0.2842743288590614,\n\
\ \"f1_stderr\": 0.004232535857485872,\n \"acc\": 0.43918283744411857,\n\
\ \"acc_stderr\": 0.01042943655066695\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2177013422818792,\n \"em_stderr\": 0.004226262781727102,\n\
\ \"f1\": 0.2842743288590614,\n \"f1_stderr\": 0.004232535857485872\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11751326762699014,\n \
\ \"acc_stderr\": 0.008870331256489993\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.760852407261247,\n \"acc_stderr\": 0.011988541844843909\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Gryphe/MythoLogic-L2-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|arc:challenge|25_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T12_37_06.579153
path:
- '**/details_harness|drop|3_2023-09-23T12-37-06.579153.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T12-37-06.579153.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T12_37_06.579153
path:
- '**/details_harness|gsm8k|5_2023-09-23T12-37-06.579153.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T12-37-06.579153.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hellaswag|10_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:05:11.641476.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T11:05:11.641476.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T11:05:11.641476.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T12_37_06.579153
path:
- '**/details_harness|winogrande|5_2023-09-23T12-37-06.579153.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T12-37-06.579153.parquet'
- config_name: results
data_files:
- split: 2023_08_09T11_05_11.641476
path:
- results_2023-08-09T11:05:11.641476.parquet
- split: 2023_09_23T12_37_06.579153
path:
- results_2023-09-23T12-37-06.579153.parquet
- split: latest
path:
- results_2023-09-23T12-37-06.579153.parquet
---
# Dataset Card for Evaluation run of Gryphe/MythoLogic-L2-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Gryphe/MythoLogic-L2-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Gryphe/MythoLogic-L2-13b](https://huggingface.co/Gryphe/MythoLogic-L2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Gryphe__MythoLogic-L2-13b",
"harness_winogrande_5",
split="train")
```
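As noted above, each run's split is named after the run timestamp, with the `-` and `:` characters replaced by `_` (the `T` separator and fractional seconds are kept). A minimal sketch of that mapping (the helper name is ours, not part of the `datasets` API):

```python
def timestamp_to_split_name(ts: str) -> str:
    """Map a run timestamp like '2023-09-23T12:37:06.579153'
    to its split name '2023_09_23T12_37_06.579153'."""
    # Only '-' and ':' are replaced; 'T' and '.' are kept as-is.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split_name("2023-09-23T12:37:06.579153"))
```

The resulting string is what you pass as `split=` when you want a specific run instead of the "latest" split.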
## Latest results
These are the [latest results from run 2023-09-23T12:37:06.579153](https://huggingface.co/datasets/open-llm-leaderboard/details_Gryphe__MythoLogic-L2-13b/blob/main/results_2023-09-23T12-37-06.579153.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2177013422818792,
"em_stderr": 0.004226262781727102,
"f1": 0.2842743288590614,
"f1_stderr": 0.004232535857485872,
"acc": 0.43918283744411857,
"acc_stderr": 0.01042943655066695
},
"harness|drop|3": {
"em": 0.2177013422818792,
"em_stderr": 0.004226262781727102,
"f1": 0.2842743288590614,
"f1_stderr": 0.004232535857485872
},
"harness|gsm8k|5": {
"acc": 0.11751326762699014,
"acc_stderr": 0.008870331256489993
},
"harness|winogrande|5": {
"acc": 0.760852407261247,
"acc_stderr": 0.011988541844843909
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Isaak-Carter/JOSIE_Wizard_Vicuna_unfiltered_de_with_greetings_70k_v2 | ---
dataset_info:
features:
- name: sample
dtype: string
splits:
- name: train
num_bytes: 174268730
num_examples: 34598
download_size: 83087014
dataset_size: 174268730
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dinaaaaaa/lima_rand_sel_50_preference | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: chosen-rating
dtype: int64
- name: rejected
dtype: string
- name: rejected-rating
dtype: int64
splits:
- name: train
num_bytes: 387946
num_examples: 500
download_size: 67766
dataset_size: 387946
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Tamazight-NLP/tamawalt-n-imZZyann | ---
language:
- zgh
- en
- fr
- ar
pretty_name: Tamawalt N ImZZyann
size_categories:
- n<1K
task_categories:
- automatic-speech-recognition
- text-to-speech
- image-classification
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
galman33/gal_yair_8300_256x256 | ---
dataset_info:
features:
- name: lat
dtype: float64
- name: lon
dtype: float64
- name: country_code
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 805012745.0
num_examples: 8300
download_size: 805035741
dataset_size: 805012745.0
---
# Dataset Card for "gal_yair_8300_256x256"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lionelchg/dolly_creative_writing | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1532046.0564174894
num_examples: 673
- name: test
num_bytes: 81951.94358251058
num_examples: 36
download_size: 1011371
dataset_size: 1613998.0
---
# Dataset Card for "dolly_creative_writing"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adhok/mmm_questions | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1156
num_examples: 7
download_size: 2227
dataset_size: 1156
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "mmm_questions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ASSERT-KTH/repairllama-datasets | ---
task_categories:
- text-generation
configs:
- config_name: ir1xor1
data_files:
- split: train
path: data/ir1xor1/train*
- split: test
path: data/ir1xor1/test*
- config_name: ir1xor3
data_files:
- split: train
path: data/ir1xor3/train*
- split: test
path: data/ir1xor3/test*
- config_name: ir1xor4
data_files:
- split: train
path: data/ir1xor4/train*
- split: test
path: data/ir1xor4/test*
- config_name: ir2xor2
data_files:
- split: train
path: data/ir2xor2/train*
- split: test
path: data/ir2xor2/test*
- config_name: ir3xor2
data_files:
- split: train
path: data/ir3xor2/train*
- split: test
path: data/ir3xor2/test*
- config_name: ir4xor2
data_files:
- split: train
path: data/ir4xor2/train*
- split: test
path: data/ir4xor2/test*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
- name: test
language:
- code
size_categories:
- 10K<n<100K
---
# RepairLLaMA - Datasets
Contains the processed fine-tuning datasets for RepairLLaMA.
## Instructions to explore the dataset
To load the dataset, you must specify which configuration (i.e., which input/output representation pair) you want to load.
```python
from datasets import load_dataset
# Load ir1xor1
dataset = load_dataset("ASSERT-KTH/repairllama-datasets", "ir1xor1")
# Load any other irXxorY pair
dataset = load_dataset("ASSERT-KTH/repairllama-datasets", "irXxorY")
```
## Citation
If you use RepairLLaMA in academic research, please cite "[RepairLLaMA: Efficient Representations and Fine-Tuned Adapters for Program Repair](http://arxiv.org/abs/2312.15698)", Technical report, arXiv 2312.15698, 2023.
```bibtex
@techreport{repairllama2023,
title={RepairLLaMA: Efficient Representations and Fine-Tuned Adapters for Program Repair},
author={Silva, Andr{\'e} and Fang, Sen and Monperrus, Martin},
url = {http://arxiv.org/abs/2312.15698},
number = {2312.15698},
  institution = {arXiv},
  year = {2023},
}
``` |
freddyaboulton/new_saving_csv_8 | ---
configs:
- config_name: default
data_files:
- split: train
path: "*.csv"
dataset_info:
features:
- name: Chatbot
dtype: string
_type: Value
- name: Image
dtype: string
_type: Value
- name: Image file
dtype: Image
- name: flag
dtype: string
_type: Value
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/nonomura_sora_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nonomura_sora/野々村そら (THE iDOLM@STER: Cinderella Girls)
This is the dataset of nonomura_sora/野々村そら (THE iDOLM@STER: Cinderella Girls), containing 61 images and their tags.
The core tags of this character are `long_hair, green_eyes, breasts, twintails, black_hair, brown_hair, drill_hair, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 61 | 83.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nonomura_sora_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 61 | 47.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nonomura_sora_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 145 | 101.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nonomura_sora_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 61 | 71.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nonomura_sora_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 145 | 145.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nonomura_sora_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nonomura_sora_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, midriff, navel, open_mouth, smile, solo, looking_at_viewer, medium_breasts, one_eye_closed, skirt, cleavage, earrings, ;d, blush, microphone, necklace, bracelet |
| 1 | 5 |  |  |  |  |  | 1girl, card_(medium), character_name, open_mouth, smile, solo, sun_symbol, star_(symbol), ;d, one_eye_closed, orange_background, skirt, bow, dress, microphone, necklace, sparkle |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | midriff | navel | open_mouth | smile | solo | looking_at_viewer | medium_breasts | one_eye_closed | skirt | cleavage | earrings | ;d | blush | microphone | necklace | bracelet | card_(medium) | character_name | sun_symbol | star_(symbol) | orange_background | bow | dress | sparkle |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:--------|:-------------|:--------|:-------|:--------------------|:-----------------|:-----------------|:--------|:-----------|:-----------|:-----|:--------|:-------------|:-----------|:-----------|:----------------|:-----------------|:-------------|:----------------|:--------------------|:------|:--------|:----------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | | X | X | X | | | X | X | | | X | | X | X | | X | X | X | X | X | X | X | X |
|
yardeny/mlm_test_set_context_len_128 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 499200
num_examples: 640
download_size: 183124
dataset_size: 499200
---
# Dataset Card for "loss_landscape_test_set_context_len_128"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alvations/c4p0-v1-en-de | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: target_backto_source
dtype: string
- name: raw_target
list:
- name: generated_text
dtype: string
- name: raw_target_backto_source
list:
- name: generated_text
dtype: string
- name: prompt
dtype: string
- name: reverse_prompt
dtype: string
- name: source_langid
dtype: string
- name: target_langid
dtype: string
- name: target_backto_source_langid
dtype: string
- name: doc_id
dtype: int64
- name: sent_id
dtype: int64
- name: timestamp
dtype: string
- name: url
dtype: string
- name: doc_hash
dtype: string
- name: dataset
dtype: string
- name: source_lang
dtype: string
- name: target_lang
dtype: string
splits:
- name: train
num_bytes: 14282882
num_examples: 11920
download_size: 6534015
dataset_size: 14282882
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DeepFoldProtein/foldseek_combined_processed_BPE500_512 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: special_tokens_mask
sequence: int8
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 4694657792
num_examples: 653488
download_size: 763584199
dataset_size: 4694657792
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Sid103/Covid23 | ---
dataset_info:
features:
- name: id
dtype: string
- name: context
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 48653509
num_examples: 1417
- name: test
num_bytes: 11608421
num_examples: 375
- name: valid
num_bytes: 4314598
num_examples: 203
download_size: 2241429
dataset_size: 64576528
---
# Dataset Card for "Covid23"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AIARTCHAN/storage | ---
license: creativeml-openrail-m
---
|