---
pretty_name: Evaluation run of Locutusque/TinyMistral-248M-v2.5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Locutusque/TinyMistral-248M-v2.5](https://huggingface.co/Locutusque/TinyMistral-248M-v2.5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2.5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-24T05:25:23.992452](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2.5/blob/main/results_2024-01-24T05-25-23.992452.json) (note
\ that there might be results for other tasks in the repository if successive evals\
\ didn't cover the same tasks. You will find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23183324101327804,\n\
\ \"acc_stderr\": 0.029931781472101626,\n \"acc_norm\": 0.23247087577232647,\n\
\ \"acc_norm_stderr\": 0.030724244636163932,\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156496,\n \"mc2\": 0.4671579728496129,\n\
\ \"mc2_stderr\": 0.015912807716045203\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.19965870307167236,\n \"acc_stderr\": 0.011681625756888676,\n\
\ \"acc_norm\": 0.24573378839590443,\n \"acc_norm_stderr\": 0.012581033453730107\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2687711611232822,\n\
\ \"acc_stderr\": 0.004424146562746121,\n \"acc_norm\": 0.2749452300338578,\n\
\ \"acc_norm_stderr\": 0.004455741817861901\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17037037037037037,\n\
\ \"acc_stderr\": 0.032477811859955935,\n \"acc_norm\": 0.17037037037037037,\n\
\ \"acc_norm_stderr\": 0.032477811859955935\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.20394736842105263,\n \"acc_stderr\": 0.03279000406310053,\n\
\ \"acc_norm\": 0.20394736842105263,\n \"acc_norm_stderr\": 0.03279000406310053\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.025604233470899095,\n\
\ \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.025604233470899095\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080339,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080339\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641144,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641144\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.036001056927277716,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.036001056927277716\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23809523809523808,\n \"acc_stderr\": 0.021935878081184763,\n \"\
acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.021935878081184763\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.039325376803928724,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.039325376803928724\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.18387096774193548,\n \"acc_stderr\": 0.022037217340267833,\n \"\
acc_norm\": 0.18387096774193548,\n \"acc_norm_stderr\": 0.022037217340267833\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.1921182266009852,\n \"acc_stderr\": 0.027719315709614802,\n \"\
acc_norm\": 0.1921182266009852,\n \"acc_norm_stderr\": 0.027719315709614802\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.15656565656565657,\n \"acc_stderr\": 0.025890520358141454,\n \"\
acc_norm\": 0.15656565656565657,\n \"acc_norm_stderr\": 0.025890520358141454\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2076923076923077,\n \"acc_stderr\": 0.020567539567246797,\n\
\ \"acc_norm\": 0.2076923076923077,\n \"acc_norm_stderr\": 0.020567539567246797\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22592592592592592,\n \"acc_stderr\": 0.02549753263960955,\n \
\ \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.02549753263960955\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.17218543046357615,\n \"acc_stderr\": 0.030826136961962396,\n \"\
acc_norm\": 0.17218543046357615,\n \"acc_norm_stderr\": 0.030826136961962396\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1871559633027523,\n \"acc_stderr\": 0.016722684526200154,\n \"\
acc_norm\": 0.1871559633027523,\n \"acc_norm_stderr\": 0.016722684526200154\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1388888888888889,\n \"acc_stderr\": 0.02358544736890014,\n \"\
acc_norm\": 0.1388888888888889,\n \"acc_norm_stderr\": 0.02358544736890014\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n\
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3004484304932735,\n\
\ \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.3004484304932735,\n\
\ \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"\
acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3076923076923077,\n\
\ \"acc_stderr\": 0.030236389942173106,\n \"acc_norm\": 0.3076923076923077,\n\
\ \"acc_norm_stderr\": 0.030236389942173106\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23499361430395913,\n\
\ \"acc_stderr\": 0.01516202415227844,\n \"acc_norm\": 0.23499361430395913,\n\
\ \"acc_norm_stderr\": 0.01516202415227844\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912255,\n\
\ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912255\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.18971061093247588,\n\
\ \"acc_stderr\": 0.02226819625878323,\n \"acc_norm\": 0.18971061093247588,\n\
\ \"acc_norm_stderr\": 0.02226819625878323\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.023468429832451156,\n\
\ \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.023468429832451156\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.02525786135943242,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.02525786135943242\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24511082138200782,\n\
\ \"acc_stderr\": 0.010986307870045517,\n \"acc_norm\": 0.24511082138200782,\n\
\ \"acc_norm_stderr\": 0.010986307870045517\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.19117647058823528,\n \"acc_stderr\": 0.02388688192244034,\n\
\ \"acc_norm\": 0.19117647058823528,\n \"acc_norm_stderr\": 0.02388688192244034\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072775,\n\
\ \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072775\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17959183673469387,\n\
\ \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.17959183673469387,\n\
\ \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n\
\ \"acc_stderr\": 0.03488647713457921,\n \"acc_norm\": 0.29239766081871343,\n\
\ \"acc_norm_stderr\": 0.03488647713457921\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.25458996328029376,\n \"mc1_stderr\": 0.015250117079156496,\n\
\ \"mc2\": 0.4671579728496129,\n \"mc2_stderr\": 0.015912807716045203\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.47829518547750594,\n\
\ \"acc_stderr\": 0.01403923921648463\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Locutusque/TinyMistral-248M-v2.5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|arc:challenge|25_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|gsm8k|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hellaswag|10_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T05-25-23.992452.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T05-25-23.992452.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- '**/details_harness|winogrande|5_2024-01-24T05-25-23.992452.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-24T05-25-23.992452.parquet'
- config_name: results
data_files:
- split: 2024_01_24T05_25_23.992452
path:
- results_2024-01-24T05-25-23.992452.parquet
- split: latest
path:
- results_2024-01-24T05-25-23.992452.parquet
---
# Dataset Card for Evaluation run of Locutusque/TinyMistral-248M-v2.5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Locutusque/TinyMistral-248M-v2.5](https://huggingface.co/Locutusque/TinyMistral-248M-v2.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2.5",
"harness_winogrande_5",
split="train")
```
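Each per-run split is named after the run timestamp, with the date dashes and time colons rewritten to underscores (the microseconds dot is kept). As a small sketch inferred from the config listing above (the helper name is illustrative, not part of the leaderboard tooling), the split name can be derived from a results timestamp like this:

```python
def split_name_from_timestamp(ts: str) -> str:
    """Derive the per-run split name from a run timestamp.

    Date separators ('-') and time separators (':') become
    underscores; the microseconds dot stays as-is.
    """
    date_part, time_part = ts.split("T")
    return date_part.replace("-", "_") + "T" + time_part.replace(":", "_")

# Example: the run documented in this card
name = split_name_from_timestamp("2024-01-24T05:25:23.992452")
print(name)  # 2024_01_24T05_25_23.992452

# The split can then be requested explicitly instead of "latest", e.g.:
# data = load_dataset(
#     "open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2.5",
#     "harness_winogrande_5",
#     split=name,
# )
```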
## Latest results
These are the [latest results from run 2024-01-24T05:25:23.992452](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2.5/blob/main/results_2024-01-24T05-25-23.992452.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.23183324101327804,
"acc_stderr": 0.029931781472101626,
"acc_norm": 0.23247087577232647,
"acc_norm_stderr": 0.030724244636163932,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156496,
"mc2": 0.4671579728496129,
"mc2_stderr": 0.015912807716045203
},
"harness|arc:challenge|25": {
"acc": 0.19965870307167236,
"acc_stderr": 0.011681625756888676,
"acc_norm": 0.24573378839590443,
"acc_norm_stderr": 0.012581033453730107
},
"harness|hellaswag|10": {
"acc": 0.2687711611232822,
"acc_stderr": 0.004424146562746121,
"acc_norm": 0.2749452300338578,
"acc_norm_stderr": 0.004455741817861901
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17037037037037037,
"acc_stderr": 0.032477811859955935,
"acc_norm": 0.17037037037037037,
"acc_norm_stderr": 0.032477811859955935
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.20394736842105263,
"acc_stderr": 0.03279000406310053,
"acc_norm": 0.20394736842105263,
"acc_norm_stderr": 0.03279000406310053
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22264150943396227,
"acc_stderr": 0.025604233470899095,
"acc_norm": 0.22264150943396227,
"acc_norm_stderr": 0.025604233470899095
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080339,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080339
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641144,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641144
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.036001056927277716,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.036001056927277716
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.021935878081184763,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.021935878081184763
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.039325376803928724,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.039325376803928724
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.18387096774193548,
"acc_stderr": 0.022037217340267833,
"acc_norm": 0.18387096774193548,
"acc_norm_stderr": 0.022037217340267833
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1921182266009852,
"acc_stderr": 0.027719315709614802,
"acc_norm": 0.1921182266009852,
"acc_norm_stderr": 0.027719315709614802
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.15656565656565657,
"acc_stderr": 0.025890520358141454,
"acc_norm": 0.15656565656565657,
"acc_norm_stderr": 0.025890520358141454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2076923076923077,
"acc_stderr": 0.020567539567246797,
"acc_norm": 0.2076923076923077,
"acc_norm_stderr": 0.020567539567246797
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.02549753263960955,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.02549753263960955
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.17218543046357615,
"acc_stderr": 0.030826136961962396,
"acc_norm": 0.17218543046357615,
"acc_norm_stderr": 0.030826136961962396
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1871559633027523,
"acc_stderr": 0.016722684526200154,
"acc_norm": 0.1871559633027523,
"acc_norm_stderr": 0.016722684526200154
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1388888888888889,
"acc_stderr": 0.02358544736890014,
"acc_norm": 0.1388888888888889,
"acc_norm_stderr": 0.02358544736890014
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3004484304932735,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.3004484304932735,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3076923076923077,
"acc_stderr": 0.030236389942173106,
"acc_norm": 0.3076923076923077,
"acc_norm_stderr": 0.030236389942173106
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23499361430395913,
"acc_stderr": 0.01516202415227844,
"acc_norm": 0.23499361430395913,
"acc_norm_stderr": 0.01516202415227844
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912255,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912255
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.18971061093247588,
"acc_stderr": 0.02226819625878323,
"acc_norm": 0.18971061093247588,
"acc_norm_stderr": 0.02226819625878323
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.023468429832451156,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.023468429832451156
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.02525786135943242,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.02525786135943242
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24511082138200782,
"acc_stderr": 0.010986307870045517,
"acc_norm": 0.24511082138200782,
"acc_norm_stderr": 0.010986307870045517
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19117647058823528,
"acc_stderr": 0.02388688192244034,
"acc_norm": 0.19117647058823528,
"acc_norm_stderr": 0.02388688192244034
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.03488647713457921,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.03488647713457921
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156496,
"mc2": 0.4671579728496129,
"mc2_stderr": 0.015912807716045203
},
"harness|winogrande|5": {
"acc": 0.47829518547750594,
"acc_stderr": 0.01403923921648463
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
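The top-level `"all"` block is the unweighted (macro) average of the per-task metrics. As an illustrative sketch, the aggregation can be reproduced like this, using a small hypothetical subset of the per-task `acc_norm` values above (the reported `"all"` figures average every evaluated task, not just these three):

```python
from statistics import mean

# A few per-task acc_norm values copied from the results above
# (illustrative subset only).
acc_norm_by_task = {
    "harness|arc:challenge|25": 0.24573378839590443,
    "harness|hellaswag|10": 0.2749452300338578,
    "harness|hendrycksTest-abstract_algebra|5": 0.2,
}

# Macro average: every task counts equally, regardless of its size.
macro_acc_norm = mean(acc_norm_by_task.values())
print(round(macro_acc_norm, 4))  # 0.2402
```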
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
eunbinni/ola_llama2_13B_t0_data | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1488820093
num_examples: 1185577
download_size: 856591874
dataset_size: 1488820093
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ola_llama2_13B_t0_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FINNUMBER/FINCH_TRAIN_QA_MCQA_400_NEWFORMAT | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1710009
num_examples: 400
download_size: 935256
dataset_size: 1710009
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DiegoAlfredo/Confiabilidad | ---
license: other
---
|
ShrinivasSK/en_hi_3 | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: src
dtype: string
- name: tgt
dtype: string
splits:
- name: train
num_bytes: 6423499.8
num_examples: 18000
- name: test
num_bytes: 713722.2
num_examples: 2000
download_size: 3835153
dataset_size: 7137222.0
---
# Dataset Card for "data_hi_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
johannes-garstenauer/embeddings_from_distilbert_masking_heaps | ---
dataset_info:
features:
- name: struct
dtype: string
- name: label
dtype: int64
- name: pred
dtype: int64
- name: cls_layer_6
sequence: float32
- name: cls_layer_5
sequence: float32
- name: cls_layer_4
sequence: float32
splits:
- name: train
num_bytes: 2564388529
num_examples: 269087
download_size: 2984971174
dataset_size: 2564388529
---
# Dataset Card for "embeddings_from_distilbert_masking_heaps"
Dataset created for thesis: "Generating Robust Representations of Structures in OpenSSH Heap Dumps" by Johannes Garstenauer.
This dataset contains representations of heap data structures along with their labels and the predicted label.
The representations are the [CLS] token embeddings of the last 3 layers of the DistilBERT model.
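For illustration only (this snippet is not part of the original card): since each `cls_layer_*` feature is a flat `float32` sequence, a loaded row's embeddings can be compared across layers or across examples with plain cosine similarity. A minimal stdlib sketch, assuming the embeddings arrive as Python float lists:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors given as float lists."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical usage after loading the dataset with `datasets.load_dataset`:
# row = dataset[0]
# cosine_similarity(row["cls_layer_6"], row["cls_layer_5"])
```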
The representation-generating model is: https://huggingface.co/johannes-garstenauer/distilbert_masking_heaps
The dataset from which representations were generated is: https://huggingface.co/datasets/johannes-garstenauer/structs_token_size_4_reduced_labelled_eval
The twin dataset (embeddings generated with a differently trained model) is: https://huggingface.co/datasets/johannes-garstenauer/embeddings_from_distilbert_class_heaps/
Thesis and associated scripts: https://zenodo.org/records/10053730 |
TeeA/Vietnamese-Chart-Dataset-v2 | ---
dataset_info:
features:
- name: title
dtype: string
- name: x_title
dtype: string
- name: y_title
dtype: string
- name: x
dtype: string
- name: y
dtype: string
- name: file_name
dtype: string
- name: chart_type
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 139868845.7142857
num_examples: 6000
- name: test
num_bytes: 12104661.142857144
num_examples: 500
- name: validation
num_bytes: 11590597.142857144
num_examples: 500
download_size: 116203614
dataset_size: 163564103.99999997
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
Thanmay/hellaswag-translated | ---
dataset_info:
config_name: hi
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
- name: ctx_a
dtype: string
- name: endings
sequence: string
splits:
- name: test
num_bytes: 23537233
num_examples: 10003
- name: validation
num_bytes: 24445725
num_examples: 10042
download_size: 18388972
dataset_size: 47982958
configs:
- config_name: hi
data_files:
- split: test
path: hi/test-*
- split: validation
path: hi/validation-*
---
|
open-llm-leaderboard/details_juhwanlee__gemma-7B-alpaca-case-1-3 | ---
pretty_name: Evaluation run of juhwanlee/gemma-7B-alpaca-case-1-3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [juhwanlee/gemma-7B-alpaca-case-1-3](https://huggingface.co/juhwanlee/gemma-7B-alpaca-case-1-3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_juhwanlee__gemma-7B-alpaca-case-1-3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-27T18:24:17.772209](https://huggingface.co/datasets/open-llm-leaderboard/details_juhwanlee__gemma-7B-alpaca-case-1-3/blob/main/results_2024-03-27T18-24-17.772209.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24061707021253761,\n\
\ \"acc_stderr\": 0.030320955641687266,\n \"acc_norm\": 0.24101574816643948,\n\
\ \"acc_norm_stderr\": 0.03112108665728224,\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474202,\n \"mc2\": 0.49050212765119683,\n\
\ \"mc2_stderr\": 0.016659390577171307\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22098976109215018,\n \"acc_stderr\": 0.012124929206818258,\n\
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.012653835621466646\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2574188408683529,\n\
\ \"acc_stderr\": 0.004363185172047171,\n \"acc_norm\": 0.2742481577375025,\n\
\ \"acc_norm_stderr\": 0.00445222854104355\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.037857144650666544,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.037857144650666544\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.20754716981132076,\n \"acc_stderr\": 0.024959918028911274,\n\
\ \"acc_norm\": 0.20754716981132076,\n \"acc_norm_stderr\": 0.024959918028911274\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749895,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749895\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.02818544130123409,\n\
\ \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.02818544130123409\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.040493392977481404,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.040493392977481404\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.03664666337225256,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.03664666337225256\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.02226181769240017,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.02226181769240017\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.03395490020856112,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.03395490020856112\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25806451612903225,\n\
\ \"acc_stderr\": 0.024892469172462822,\n \"acc_norm\": 0.25806451612903225,\n\
\ \"acc_norm_stderr\": 0.024892469172462822\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.03127090713297697,\n\
\ \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.03127090713297697\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.03192271569548299,\n\
\ \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.03192271569548299\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.03191178226713549,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.03191178226713549\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178256,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178256\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.21025641025641026,\n \"acc_stderr\": 0.020660597485026924,\n\
\ \"acc_norm\": 0.21025641025641026,\n \"acc_norm_stderr\": 0.020660597485026924\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.21851851851851853,\n \"acc_stderr\": 0.025195752251823786,\n \
\ \"acc_norm\": 0.21851851851851853,\n \"acc_norm_stderr\": 0.025195752251823786\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341933,\n\
\ \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341933\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008936,\n \"\
acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008936\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22935779816513763,\n \"acc_stderr\": 0.018025349724618688,\n \"\
acc_norm\": 0.22935779816513763,\n \"acc_norm_stderr\": 0.018025349724618688\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1712962962962963,\n \"acc_stderr\": 0.025695341643824688,\n \"\
acc_norm\": 0.1712962962962963,\n \"acc_norm_stderr\": 0.025695341643824688\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604243,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604243\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.23766816143497757,\n\
\ \"acc_stderr\": 0.02856807946471426,\n \"acc_norm\": 0.23766816143497757,\n\
\ \"acc_norm_stderr\": 0.02856807946471426\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.034624199316156234,\n\
\ \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.034624199316156234\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23931623931623933,\n\
\ \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.23931623931623933,\n\
\ \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26436781609195403,\n\
\ \"acc_stderr\": 0.01576998484069052,\n \"acc_norm\": 0.26436781609195403,\n\
\ \"acc_norm_stderr\": 0.01576998484069052\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.27167630057803466,\n \"acc_stderr\": 0.02394851290546837,\n\
\ \"acc_norm\": 0.27167630057803466,\n \"acc_norm_stderr\": 0.02394851290546837\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.024630048979824768,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.024630048979824768\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2315112540192926,\n\
\ \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.2315112540192926,\n\
\ \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.023016705640262203,\n\
\ \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.023016705640262203\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2375886524822695,\n \"acc_stderr\": 0.025389512552729906,\n \
\ \"acc_norm\": 0.2375886524822695,\n \"acc_norm_stderr\": 0.025389512552729906\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.19852941176470587,\n \"acc_stderr\": 0.024231013370541083,\n\
\ \"acc_norm\": 0.19852941176470587,\n \"acc_norm_stderr\": 0.024231013370541083\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2761437908496732,\n \"acc_stderr\": 0.018087276935663137,\n \
\ \"acc_norm\": 0.2761437908496732,\n \"acc_norm_stderr\": 0.018087276935663137\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.15454545454545454,\n\
\ \"acc_stderr\": 0.03462262571262667,\n \"acc_norm\": 0.15454545454545454,\n\
\ \"acc_norm_stderr\": 0.03462262571262667\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n\
\ \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2469879518072289,\n\
\ \"acc_stderr\": 0.03357351982064536,\n \"acc_norm\": 0.2469879518072289,\n\
\ \"acc_norm_stderr\": 0.03357351982064536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474202,\n \"mc2\": 0.49050212765119683,\n\
\ \"mc2_stderr\": 0.016659390577171307\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5035516969218626,\n \"acc_stderr\": 0.014052131146915853\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/juhwanlee/gemma-7B-alpaca-case-1-3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|arc:challenge|25_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|gsm8k|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hellaswag|10_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-24-17.772209.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T18-24-17.772209.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- '**/details_harness|winogrande|5_2024-03-27T18-24-17.772209.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-27T18-24-17.772209.parquet'
- config_name: results
data_files:
- split: 2024_03_27T18_24_17.772209
path:
- results_2024-03-27T18-24-17.772209.parquet
- split: latest
path:
- results_2024-03-27T18-24-17.772209.parquet
---
# Dataset Card for Evaluation run of juhwanlee/gemma-7B-alpaca-case-1-3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [juhwanlee/gemma-7B-alpaca-case-1-3](https://huggingface.co/juhwanlee/gemma-7B-alpaca-case-1-3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_juhwanlee__gemma-7B-alpaca-case-1-3",
"harness_winogrande_5",
split="train")
```
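The config names follow a simple convention: `harness_<task>_<n_shots>`, with MMLU subjects appearing as `hendrycksTest_<subject>` at 5 shots. As an illustrative sketch (the `config_name` helper below is ours, not part of the `datasets` library), any eval's config name can be built like this:

```python
# Hypothetical helper (not part of the `datasets` library) that builds the
# config name used by this dataset: "harness_<task>_<n_shots>".
def config_name(task: str, n_shots: int) -> str:
    return f"harness_{task}_{n_shots}"

# MMLU subjects are nested under "hendrycksTest_<subject>" with 5 shots:
print(config_name("hendrycksTest_college_physics", 5))
# harness_hendrycksTest_college_physics_5
print(config_name("truthfulqa_mc", 0))
# harness_truthfulqa_mc_0
```

Any of these names can be passed as the second argument to `load_dataset`, together with either the timestamped split name or `"latest"`.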
## Latest results
These are the [latest results from run 2024-03-27T18:24:17.772209](https://huggingface.co/datasets/open-llm-leaderboard/details_juhwanlee__gemma-7B-alpaca-case-1-3/blob/main/results_2024-03-27T18-24-17.772209.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.24061707021253761,
"acc_stderr": 0.030320955641687266,
"acc_norm": 0.24101574816643948,
"acc_norm_stderr": 0.03112108665728224,
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474202,
"mc2": 0.49050212765119683,
"mc2_stderr": 0.016659390577171307
},
"harness|arc:challenge|25": {
"acc": 0.22098976109215018,
"acc_stderr": 0.012124929206818258,
"acc_norm": 0.25,
"acc_norm_stderr": 0.012653835621466646
},
"harness|hellaswag|10": {
"acc": 0.2574188408683529,
"acc_stderr": 0.004363185172047171,
"acc_norm": 0.2742481577375025,
"acc_norm_stderr": 0.00445222854104355
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.037857144650666544,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.037857144650666544
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.20754716981132076,
"acc_stderr": 0.024959918028911274,
"acc_norm": 0.20754716981132076,
"acc_norm_stderr": 0.024959918028911274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749895,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749895
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.24680851063829787,
"acc_stderr": 0.02818544130123409,
"acc_norm": 0.24680851063829787,
"acc_norm_stderr": 0.02818544130123409
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481404,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481404
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.03664666337225256,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.03664666337225256
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.02226181769240017,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.02226181769240017
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.03395490020856112,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.03395490020856112
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25806451612903225,
"acc_stderr": 0.024892469172462822,
"acc_norm": 0.25806451612903225,
"acc_norm_stderr": 0.024892469172462822
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.03127090713297697,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.03127090713297697
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.03192271569548299,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.03192271569548299
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.03191178226713549,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.03191178226713549
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178256,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178256
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21025641025641026,
"acc_stderr": 0.020660597485026924,
"acc_norm": 0.21025641025641026,
"acc_norm_stderr": 0.020660597485026924
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21851851851851853,
"acc_stderr": 0.025195752251823786,
"acc_norm": 0.21851851851851853,
"acc_norm_stderr": 0.025195752251823786
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2605042016806723,
"acc_stderr": 0.028510251512341933,
"acc_norm": 0.2605042016806723,
"acc_norm_stderr": 0.028510251512341933
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008936,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22935779816513763,
"acc_stderr": 0.018025349724618688,
"acc_norm": 0.22935779816513763,
"acc_norm_stderr": 0.018025349724618688
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1712962962962963,
"acc_stderr": 0.025695341643824688,
"acc_norm": 0.1712962962962963,
"acc_norm_stderr": 0.025695341643824688
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604243,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604243
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.23766816143497757,
"acc_stderr": 0.02856807946471426,
"acc_norm": 0.23766816143497757,
"acc_norm_stderr": 0.02856807946471426
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578729,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578729
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23931623931623933,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.23931623931623933,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26436781609195403,
"acc_stderr": 0.01576998484069052,
"acc_norm": 0.26436781609195403,
"acc_norm_stderr": 0.01576998484069052
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.02394851290546837,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.02394851290546837
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.024630048979824768,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.024630048979824768
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2315112540192926,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.2315112540192926,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.023016705640262203,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.023016705640262203
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2375886524822695,
"acc_stderr": 0.025389512552729906,
"acc_norm": 0.2375886524822695,
"acc_norm_stderr": 0.025389512552729906
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19852941176470587,
"acc_stderr": 0.024231013370541083,
"acc_norm": 0.19852941176470587,
"acc_norm_stderr": 0.024231013370541083
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2761437908496732,
"acc_stderr": 0.018087276935663137,
"acc_norm": 0.2761437908496732,
"acc_norm_stderr": 0.018087276935663137
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.15454545454545454,
"acc_stderr": 0.03462262571262667,
"acc_norm": 0.15454545454545454,
"acc_norm_stderr": 0.03462262571262667
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2469879518072289,
"acc_stderr": 0.03357351982064536,
"acc_norm": 0.2469879518072289,
"acc_norm_stderr": 0.03357351982064536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474202,
"mc2": 0.49050212765119683,
"mc2_stderr": 0.016659390577171307
},
"harness|winogrande|5": {
"acc": 0.5035516969218626,
"acc_stderr": 0.014052131146915853
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
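The `"all"` block above aggregates the per-task metrics. As a rough sketch of that kind of aggregation (an unweighted mean over tasks; the leaderboard's exact weighting and task selection may differ), using a few `acc_norm` values copied from the results:

```python
# Illustrative only: unweighted mean of a few per-task acc_norm values taken
# from the results JSON above; the leaderboard's exact aggregation may differ.
per_task_acc_norm = {
    "harness|hendrycksTest-abstract_algebra|5": 0.2,
    "harness|hendrycksTest-anatomy|5": 0.25925925925925924,
    "harness|hendrycksTest-astronomy|5": 0.19736842105263158,
}

mean_acc_norm = sum(per_task_acc_norm.values()) / len(per_task_acc_norm)
print(round(mean_acc_norm, 4))  # 0.2189
```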
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
open-llm-leaderboard/details_Sao10K__Mythical-Destroyer-V2-L2-13B
---
pretty_name: Evaluation run of Sao10K/Mythical-Destroyer-V2-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Mythical-Destroyer-V2-L2-13B](https://huggingface.co/Sao10K/Mythical-Destroyer-V2-L2-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Mythical-Destroyer-V2-L2-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T00:31:39.922453](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Mythical-Destroyer-V2-L2-13B/blob/main/results_2023-09-17T00-31-39.922453.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004089765100671141,\n\
\ \"em_stderr\": 0.0006535802669912855,\n \"f1\": 0.10332739093959775,\n\
\ \"f1_stderr\": 0.00186450066100098,\n \"acc\": 0.3737174427782163,\n\
\ \"acc_stderr\": 0.0061055742246970525\n },\n \"harness|drop|3\":\
\ {\n \"em\": 0.004089765100671141,\n \"em_stderr\": 0.0006535802669912855,\n\
\ \"f1\": 0.10332739093959775,\n \"f1_stderr\": 0.00186450066100098\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7474348855564326,\n\
\ \"acc_stderr\": 0.012211148449394105\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Mythical-Destroyer-V2-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T00_31_39.922453
path:
- '**/details_harness|drop|3_2023-09-17T00-31-39.922453.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T00-31-39.922453.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T00_31_39.922453
path:
- '**/details_harness|gsm8k|5_2023-09-17T00-31-39.922453.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T00-31-39.922453.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T00_31_39.922453
path:
- '**/details_harness|winogrande|5_2023-09-17T00-31-39.922453.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T00-31-39.922453.parquet'
- config_name: results
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- results_2023-08-29T20:53:13.636307.parquet
- split: 2023_09_17T00_31_39.922453
path:
- results_2023-09-17T00-31-39.922453.parquet
- split: latest
path:
- results_2023-09-17T00-31-39.922453.parquet
---
# Dataset Card for Evaluation run of Sao10K/Mythical-Destroyer-V2-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Mythical-Destroyer-V2-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Mythical-Destroyer-V2-L2-13B](https://huggingface.co/Sao10K/Mythical-Destroyer-V2-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Mythical-Destroyer-V2-L2-13B",
"harness_winogrande_5",
split="train")
```
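The per-run split names follow an observable convention: the run timestamp with `-` and `:` replaced by `_`. A minimal sketch of that mapping (an assumption inferred from the split names in the YAML above, not an official API):

```python
# Split names for individual runs appear to be derived from the run timestamp:
# '-' and ':' are replaced with '_' (the '.' before the microseconds is kept).
def timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp to its split name, e.g.
    '2023-09-17T00:31:39.922453' -> '2023_09_17T00_31_39.922453'."""
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-09-17T00:31:39.922453"))
# 2023_09_17T00_31_39.922453
```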
## Latest results
These are the [latest results from run 2023-09-17T00:31:39.922453](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Mythical-Destroyer-V2-L2-13B/blob/main/results_2023-09-17T00-31-39.922453.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.004089765100671141,
"em_stderr": 0.0006535802669912855,
"f1": 0.10332739093959775,
"f1_stderr": 0.00186450066100098,
"acc": 0.3737174427782163,
"acc_stderr": 0.0061055742246970525
},
"harness|drop|3": {
"em": 0.004089765100671141,
"em_stderr": 0.0006535802669912855,
"f1": 0.10332739093959775,
"f1_stderr": 0.00186450066100098
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.7474348855564326,
"acc_stderr": 0.012211148449394105
}
}
```
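For metrics shared across tasks, the "all" block appears to be the unweighted mean of the per-task values; a minimal pure-Python sketch using the "acc" numbers copied verbatim from the JSON above:

```python
# Per-task accuracies copied from the results JSON above.
task_results = {
    "harness|gsm8k|5": 0.0,
    "harness|winogrande|5": 0.7474348855564326,
}

# The unweighted mean over tasks reproduces the "acc" reported under "all".
mean_acc = sum(task_results.values()) / len(task_results)
print(mean_acc)  # 0.3737174427782163
```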
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
player1537/Bloom-560m-trained-on-Wizard-Vicuna-Uncensored-trained-on-Based | ---
dataset_info:
features:
- name: text
dtype: string
- name: tokens
sequence: int64
splits:
- name: train
num_bytes: 1512752
num_examples: 120
download_size: 323831
dataset_size: 1512752
---
# Dataset Card for "Bloom-560m-trained-on-Wizard-Vicuna-Uncensored-trained-on-Based"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_29 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 872660676.0
num_examples: 170043
download_size: 893278778
dataset_size: 872660676.0
---
# Dataset Card for "chunk_29"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mrpc_say_complementizer | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 149248
num_examples: 516
- name: train
num_bytes: 322608
num_examples: 1117
- name: validation
num_bytes: 39947
num_examples: 137
download_size: 340070
dataset_size: 511803
---
# Dataset Card for "MULTI_VALUE_mrpc_say_complementizer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
milsunone/cural-functionary-small-5000 | ---
dataset_info:
features:
- name: attention_mask
sequence: int64
- name: input_ids
sequence: int64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 269186916
num_examples: 2243
download_size: 4405162
dataset_size: 269186916
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vannarathp/segmented-kptimes | ---
license: mit
---
|
anumafzal94/pubmed-2048 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: text
dtype: string
- name: summary
dtype: string
splits:
- name: test
num_bytes: 126100060
num_examples: 6592
- name: train
num_bytes: 96394526.44590327
num_examples: 5000
- name: validation
num_bytes: 19355274.16145754
num_examples: 1005
download_size: 78306070
dataset_size: 241849860.6073608
---
# Dataset Card for "pubmed-2048"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
golaxy/KnowCoder-Schema-Following-Data | ---
configs:
- config_name: default
data_files:
- split: schema_following_ner
path: instruction-tuning/NER/train-prompt.json
- split: schema_following_re
path: instruction-tuning/RE/train-prompt.json
- split: schema_following_ed
path: instruction-tuning/ED/train-prompt.json
- split: schema_following_eae
path: instruction-tuning/EAE/train-prompt.json
license: apache-2.0
language:
- en
tags:
- instruction tuning
size_categories:
- 100K<n<1M
---
<p align="center">
<img src="https://github.com/ICT-GoKnow/ict-goknow.github.io/blob/main/knowcoder/static/images/logo.png?raw=true" width="80">
</p>
<h1 align="center"> KnowCoder: Coding Structured Knowledge into LLMs for Universal
Information Extraction </h1>
<p align="center">
<a href="https://arxiv.org/abs/2403.07969">📃 Paper</a>
|
<a href="https://huggingface.co/collections/golaxy/knowcoder-65fc3cd385d98567da412abf" >🤗 Resource (Schema • Data • Model)</a>
|
<a href="https://ict-goknow.github.io/knowcoder/">🚀 Try KnowCoder (coming soon)!</a>
</p>
### Schema Following Data
The schema following data is constructed on [UniversalNER](https://huggingface.co/Universal-NER), [InstructIE](https://huggingface.co/datasets/zjunlp/InstructIE), and [LSEE](https://github.com/acl2017submission/event-data). The statistics of schema following data are presented as follows.
<p align="center">
<img src="https://github.com/ICT-GoKnow/ict-goknow.github.io/blob/main/knowcoder/static/images/schema-following.png?raw=true" style="width: 75%;">
</p>
The cases of schema following data are shown [here](https://github.com/ICT-GoKnow/ict-goknow.github.io/blob/main/knowcoder/static/images/Cases.md).
Due to data protection concerns, we provide only 100 sample records per task here for reference. |
NathanRoll/TalkBank_CA_wM_CLAPI | ---
dataset_info:
features:
- name: audio
sequence: float32
- name: text
dtype: string
- name: __index_level_0__
dtype: string
splits:
- name: train
num_bytes: 1651019172
num_examples: 24
download_size: 1653522367
dataset_size: 1651019172
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "TalkBank_CA_wM_CLAPI"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/DTD_parition1_test_google_flan_t5_xxl_mode_C_A_T_ns_1880 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_1_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_rices
num_bytes: 1236913
num_examples: 1880
- name: fewshot_3_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_rices
num_bytes: 2420309
num_examples: 1880
- name: fewshot_5_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_rices
num_bytes: 3604225
num_examples: 1880
- name: fewshot_0_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_rices
num_bytes: 644439
num_examples: 1880
- name: fewshot_0_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 543264
num_examples: 1880
- name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_text_davinci_003_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 969589
num_examples: 1880
download_size: 2542967
dataset_size: 9418739
---
# Dataset Card for "DTD_parition1_test_google_flan_t5_xxl_mode_C_A_T_ns_1880"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibranze/araproje_mmlu_tr_s1 | ---
dataset_info:
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: validation
num_bytes: 137404.0
num_examples: 250
download_size: 0
dataset_size: 137404.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_mmlu_tr_s1"
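The `answer` feature declared in the YAML above is a class label whose integer indices map to the letters A-D; a dependency-free sketch of that mapping (pure Python, names copied from the YAML):

```python
# Mapping between stored class-label indices and answer letters,
# as declared in the dataset's `answer` feature ('0': A ... '3': D).
ANSWER_NAMES = ["A", "B", "C", "D"]

def answer_to_letter(label: int) -> str:
    """Convert an integer class-label index into its answer letter."""
    return ANSWER_NAMES[label]

def letter_to_answer(letter: str) -> int:
    """Inverse mapping, e.g. for scoring model predictions."""
    return ANSWER_NAMES.index(letter)

print(answer_to_letter(2))  # C
```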
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
allancasado04/bonito_privacy_qa_sft_data | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2336470
num_examples: 8738
- name: test
num_bytes: 580857
num_examples: 2185
download_size: 1207490
dataset_size: 2917327
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
sayan1101/reward_model_ranking_dataset_RLHF | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid1
path: data/valid1-*
- split: valid2
path: data/valid2-*
dataset_info:
features:
- name: prompt
dtype: string
- name: summary_3
dtype: string
- name: summary_4
dtype: string
- name: summary_1
dtype: string
- name: summary_2
dtype: string
splits:
- name: train
num_bytes: 260765
num_examples: 350
- name: test
num_bytes: 260765
num_examples: 350
- name: valid1
num_bytes: 260765
num_examples: 350
- name: valid2
num_bytes: 260765
num_examples: 350
download_size: 0
dataset_size: 1043060
---
# Dataset Card for "reward_model_ranking_dataset_RLHF"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
result-kand2-sdxl-wuerst-karlo/e87ec3b2 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 153
num_examples: 10
download_size: 1306
dataset_size: 153
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "e87ec3b2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mehmetadventlabs/secim2023-cb-partial | ---
license: mit
---
|
AdapterOcean/langchain-standardized_cluster_0_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 3993378
num_examples: 992
download_size: 1720254
dataset_size: 3993378
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "langchain-standardized_cluster_0_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Xnhyacinth/TQA-Image | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: answers
sequence: string
- name: target
dtype: string
- name: ctxs
list:
- name: id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
- name: compressed_ctxs_1
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
- name: compressed_ctxs_100
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
splits:
- name: train
num_bytes: 5395421949
num_examples: 78785
- name: eval
num_bytes: 605118800
num_examples: 8837
- name: test
num_bytes: 775128252
num_examples: 11313
download_size: 3917964666
dataset_size: 6775669001
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
- split: test
path: data/test-*
---
|
tyzhu/rareid_find_second_sent_train_10_eval_10 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 39358
num_examples: 30
- name: validation
num_bytes: 10863
num_examples: 10
download_size: 49044
dataset_size: 50221
---
# Dataset Card for "rareid_find_second_sent_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
edumunozsala/hate-speech-18-es | ---
dataset_info:
features:
- name: text_es
dtype: string
splits:
- name: train
num_bytes: 300365.41200657893
num_examples: 2937
download_size: 223824
dataset_size: 300365.41200657893
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-phpthinh__exampletx-constructive-7f6ba0-1708559815 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- phpthinh/exampletx
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-3b
metrics: []
dataset_name: phpthinh/exampletx
dataset_config: constructive
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-3b
* Dataset: phpthinh/exampletx
* Config: constructive
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@phpthinh](https://huggingface.co/phpthinh) for evaluating this model. |
open-llm-leaderboard/details_euclaise__gpt-neox-122m-minipile-digits | ---
pretty_name: Evaluation run of euclaise/gpt-neox-122m-minipile-digits
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [euclaise/gpt-neox-122m-minipile-digits](https://huggingface.co/euclaise/gpt-neox-122m-minipile-digits)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_euclaise__gpt-neox-122m-minipile-digits\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T13:51:30.117179](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__gpt-neox-122m-minipile-digits/blob/main/results_2023-09-22T13-51-30.117179.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n\
\ \"em_stderr\": 0.00033145814652193507,\n \"f1\": 0.010940645973154372,\n\
\ \"f1_stderr\": 0.0006046286914074817,\n \"acc\": 0.2616416732438832,\n\
\ \"acc_stderr\": 0.007018620654786819\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.00033145814652193507,\n\
\ \"f1\": 0.010940645973154372,\n \"f1_stderr\": 0.0006046286914074817\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5232833464877664,\n\
\ \"acc_stderr\": 0.014037241309573638\n }\n}\n```"
repo_url: https://huggingface.co/euclaise/gpt-neox-122m-minipile-digits
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T13_51_30.117179
path:
- '**/details_harness|drop|3_2023-09-22T13-51-30.117179.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T13-51-30.117179.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T13_51_30.117179
path:
- '**/details_harness|gsm8k|5_2023-09-22T13-51-30.117179.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T13-51-30.117179.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:54:44.863431.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:54:44.863431.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:54:44.863431.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T13_51_30.117179
path:
- '**/details_harness|winogrande|5_2023-09-22T13-51-30.117179.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T13-51-30.117179.parquet'
- config_name: results
data_files:
- split: 2023_07_19T13_54_44.863431
path:
- results_2023-07-19T13:54:44.863431.parquet
- split: 2023_09_22T13_51_30.117179
path:
- results_2023-09-22T13-51-30.117179.parquet
- split: latest
path:
- results_2023-09-22T13-51-30.117179.parquet
---
# Dataset Card for Evaluation run of euclaise/gpt-neox-122m-minipile-digits
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/euclaise/gpt-neox-122m-minipile-digits
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [euclaise/gpt-neox-122m-minipile-digits](https://huggingface.co/euclaise/gpt-neox-122m-minipile-digits) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_euclaise__gpt-neox-122m-minipile-digits",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T13:51:30.117179](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__gpt-neox-122m-minipile-digits/blob/main/results_2023-09-22T13-51-30.117179.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652193507,
"f1": 0.010940645973154372,
"f1_stderr": 0.0006046286914074817,
"acc": 0.2616416732438832,
"acc_stderr": 0.007018620654786819
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652193507,
"f1": 0.010940645973154372,
"f1_stderr": 0.0006046286914074817
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5232833464877664,
"acc_stderr": 0.014037241309573638
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
krishnakamath/shopping_intent | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
- split: labels_and_id
path: data/labels_and_id-*
dataset_info:
features:
- name: input
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 25829.303909205548
num_examples: 634
- name: validation
num_bytes: 3218.4779319041613
num_examples: 79
- name: test
num_bytes: 3259.21815889029
num_examples: 80
- name: labels_and_id
num_bytes: 56
num_examples: 2
download_size: 18333
dataset_size: 32363.0
---
# Dataset Card for "shopping_intent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
amogh-sinha/Llama-2-7B-Chat-GGML | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 39206
num_examples: 1
download_size: 16971
dataset_size: 39206
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Llama-2-7B-Chat-GGML"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rathi2023/bin_nhoodbox | ---
dataset_info:
features:
- name: image
dtype: image
- name: objects
struct:
- name: Ids
sequence: string
- name: bbox
sequence:
sequence: float64
- name: area
sequence: float64
- name: captions
sequence: string
- name: quantities
sequence: int64
splits:
- name: train
num_bytes: 1222200107.502
num_examples: 20243
download_size: 1144218394
dataset_size: 1222200107.502
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zaydzuhri/the_pile_tokenized_5percent_truncated_packed_v2 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 17294237972
num_examples: 2456775
download_size: 7744795432
dataset_size: 17294237972
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
edgarseverino/minhavozed | ---
license: openrail
---
|
Vitrola40/AnaPaulaValadao | ---
license: openrail
---
|
MyRebRIc/tomas | ---
license: openrail
---
|
benayas/atis_chatgpt_10pct_v2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 447879
num_examples: 4455
download_size: 154935
dataset_size: 447879
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
doushabao4766/ontonotes_zh_ner | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
splits:
- name: train
num_bytes: 7629700
num_examples: 15724
- name: test
num_bytes: 3188216
num_examples: 4346
- name: validation
num_bytes: 3074667
num_examples: 4301
download_size: 1852625
dataset_size: 13892583
---
# Dataset Card for "ontonotes_zh_ner"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
smcproject/ml-phonetic-lexicon | ---
license: cc-by-4.0
task_categories:
- text2text-generation
language:
- ml
pretty_name: Malayalam Grapheme Phoneme Dataset
configs:
- config_name: commonword
data_files: commonword.tsv
sep: "\t"
- config_name: english_loan
data_files: english_loan.tsv
sep: "\t"
- config_name: noun
data_files: noun.tsv
sep: "\t"
- config_name: personname
data_files: personname.tsv
sep: "\t"
- config_name: placename
data_files: placename.tsv
sep: "\t"
- config_name: pronoun
data_files: pronoun.tsv
sep: "\t"
- config_name: propernoun
data_files: propernoun.tsv
sep: "\t"
- config_name: sanskrit_loan
data_files: sanskrit_loan.tsv
sep: "\t"
- config_name: verb
data_files: verb.tsv
sep: "\t"
size_categories:
- 100K<n<1M
---
## Malayalam Phonetic Lexicon
This dataset contains words in the Malayalam script and their pronunciations in the International Phonetic Alphabet (IPA).
The words in the lexicon are sourced from:
1) The most frequent 100,000 words from the [Indic NLP](https://github.com/AI4Bharat/indicnlp_corpus) corpus
2) A curated collection of word categories from the [Mlmorph](https://pypi.org/project/mlmorph/) project
The pronunciations are generated using the [Mlphon](https://pypi.org/project/mlphon/) Python library.
### Applications
- Ready-to-use pronunciation lexicons for ASR and TTS
- Training data-driven grapheme-to-phoneme models
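As a minimal sketch of the second use case's first step, a tab-separated lexicon of this shape can be turned into a word-to-IPA lookup table. The sample rows below are illustrative placeholders, not actual entries from the dataset (real rows carry words in Malayalam script); inspect the TSV files before relying on any column layout.

```python
import csv
import io

# Illustrative TSV content: one word/pronunciation pair per line,
# tab-separated. Real entries are in Malayalam script.
sample_tsv = "amma\tamma\nkavya\tk\u0259vja\n"

# Build a simple grapheme -> IPA lookup table.
lexicon = {}
for word, ipa in csv.reader(io.StringIO(sample_tsv), delimiter="\t"):
    lexicon[word] = ipa

print(len(lexicon))  # 2
```

The same loop applies unchanged to any of the per-category TSV files listed in the configs above.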
If you use this dataset for research, please cite:
```
@ARTICLE{kavya2022mlphon,
author={Manohar, Kavya and Jayan, A. R. and Rajan, Rajeev},
journal={IEEE Access},
title={Mlphon: A Multifunctional Grapheme-Phoneme Conversion Tool Using Finite State Transducers},
year={2022},
volume={10},
number={},
pages={97555-97575},
doi={10.1109/ACCESS.2022.3204403}}
``` |
wilsonorozco/porfiriotorres | ---
license: openrail
---
|
AIBoy1993/Prompt-CHIP-CTC | ---
task_categories:
- text-classification
- question-answering
- text2text-generation
language:
- zh
size_categories:
- 1K<n<10K
--- |
CyberHarem/nephren_sukasuka | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Nephren Ruq Insania/ネフレン・ルク・インサニア (Shuumatsu Nani Shitemasu Ka? Isogashii Desu Ka?)
This is the dataset of Nephren Ruq Insania/ネフレン・ルク・インサニア (Shuumatsu Nani Shitemasu Ka? Isogashii Desu Ka?), containing 228 images and their tags.
The core tags of this character are `twintails, ribbon, grey_eyes, hair_ribbon, grey_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 228 | 162.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nephren_sukasuka/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 228 | 162.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nephren_sukasuka/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 413 | 280.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nephren_sukasuka/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nephren_sukasuka',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, closed_mouth, collarbone, expressionless, empty_eyes, hood, solo, blue_hair, upper_body |
| 1 | 7 |  |  |  |  |  | 1girl, blood, red_eyes, solo, white_hair, long_hair, upside-down, looking_at_viewer, lying |
| 2 | 6 |  |  |  |  |  | 1girl, blood, red_eyes, solo, clenched_teeth, injury, white_hair |
| 3 | 5 |  |  |  |  |  | 1girl, blood_on_clothes, blood_on_face, collarbone, open_mouth, solo, closed_eyes, injury, sitting, white_dress, dirty |
| 4 | 14 |  |  |  |  |  | 1girl, military_uniform, solo, upper_body, anime_coloring, jacket, buttons, looking_at_viewer, open_mouth, parted_lips |
| 5 | 14 |  |  |  |  |  | military_uniform, solo_focus, blue_hair, belt, blue_eyes, long_sleeves, 2girls, long_hair, 1girl, expressionless |
| 6 | 8 |  |  |  |  |  | 1boy, 1girl, profile, blue_hair, holding_spoon, school_uniform, 2girls, furry_male |
| 7 | 10 |  |  |  |  |  | bare_shoulders, collarbone, indoors, pink_bow, 1girl, solo, elbow_gloves, neck_bell, off_shoulder, pink_dress, white_gloves, window, closed_mouth, frilled_dress, night, upper_body, anime_coloring, strapless |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_mouth | collarbone | expressionless | empty_eyes | hood | solo | blue_hair | upper_body | blood | red_eyes | white_hair | long_hair | upside-down | looking_at_viewer | lying | clenched_teeth | injury | blood_on_clothes | blood_on_face | open_mouth | closed_eyes | sitting | white_dress | dirty | military_uniform | anime_coloring | jacket | buttons | parted_lips | solo_focus | belt | blue_eyes | long_sleeves | 2girls | 1boy | profile | holding_spoon | school_uniform | furry_male | bare_shoulders | indoors | pink_bow | elbow_gloves | neck_bell | off_shoulder | pink_dress | white_gloves | window | frilled_dress | night | strapless |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------------|:-----------------|:-------------|:-------|:-------|:------------|:-------------|:--------|:-----------|:-------------|:------------|:--------------|:--------------------|:--------|:-----------------|:---------|:-------------------|:----------------|:-------------|:--------------|:----------|:--------------|:--------|:-------------------|:-----------------|:---------|:----------|:--------------|:-------------|:-------|:------------|:---------------|:---------|:-------|:----------|:----------------|:-----------------|:-------------|:-----------------|:----------|:-----------|:---------------|:------------|:---------------|:-------------|:---------------|:---------|:----------------|:--------|:------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | | | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | | | | | X | | | X | X | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | X | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 14 |  |  |  |  |  | X | | | | | | X | | X | | | | | | X | | | | | | X | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 14 |  |  |  |  |  | X | | | X | | | | X | | | | | X | | | | | | | | | | | | | X | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 6 | 8 |  |  |  |  |  | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | |
| 7 | 10 |  |  |  |  |  | X | X | X | | | | X | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
OneFly7/llama2-sst2-fine-tuning | ---
dataset_info:
features:
- name: label_text
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 23202578
num_examples: 67349
- name: validation
num_bytes: 334716
num_examples: 872
download_size: 4418625
dataset_size: 23537294
task_categories:
- text-classification
language:
- en
---
# Dataset Card for "llama2-sst2-fine-tuning"
## Dataset Description
The Llama2-sst2-fine-tuning dataset is designed for supervised fine-tuning of LLaMA V2 on the GLUE SST2 sentiment classification task.
We provide two splits: training and validation.
To make the data directly usable for fine-tuning, each example is converted into the LLaMA V2 supervised fine-tuning prompt template, so the data follows this format:
```
<s>[INST] <<SYS>>
{System prompt}
<</SYS>>
{User prompt} [/INST] {Label} </s>
```
The feasibility of this dataset has been verified by supervised fine-tuning of the meta-llama/Llama-2-7b-hf model.
Note: for the sake of simplicity, we have retained only one new column of data ('text').
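As a minimal sketch, the template above can be reproduced for a raw SST-2 example as follows. The system prompt wording here is an illustrative assumption, not necessarily the exact string used to build this dataset:

```python
def build_prompt(sentence: str, label_text: str) -> str:
    # Illustrative system prompt -- the exact wording used for this
    # dataset may differ.
    system_prompt = "Analyze the sentiment of the following sentence."
    return (
        f"<s>[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n"
        f"{sentence} [/INST] {label_text} </s>"
    )

example = build_prompt("a charming and often affecting journey.", "positive")
print(example)
```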
## Other Useful Links
- [Get Llama 2 Prompt Format Right](https://www.reddit.com/r/LocalLLaMA/comments/155po2p/get_llama_2_prompt_format_right/)
- [Fine-Tune Your Own Llama 2 Model in a Colab Notebook](https://towardsdatascience.com/fine-tune-your-own-llama-2-model-in-a-colab-notebook-df9823a04a32)
- [Instruction fine-tuning Llama 2 with PEFT’s QLoRa method](https://medium.com/@ud.chandra/instruction-fine-tuning-llama-2-with-pefts-qlora-method-d6a801ebb19)
- [GLUE SST2 Dataset](https://www.tensorflow.org/datasets/catalog/glue#gluesst2)
<!--[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)--> |
iarejula/Higgs | ---
pretty_name: Higgs
tags:
- keyword2
- keyword1
---
# Higgs
Created from the AIOD platform |
autoevaluate/autoeval-eval-adversarial_qa-adversarialQA-a69c9b-3342492274 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- adversarial_qa
eval_info:
task: extractive_question_answering
model: 2020uee0139/distilbert-base-uncased-finetuned-squad
metrics: []
dataset_name: adversarial_qa
dataset_config: adversarialQA
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: 2020uee0139/distilbert-base-uncased-finetuned-squad
* Dataset: adversarial_qa
* Config: adversarialQA
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@yahiya](https://huggingface.co/yahiya) for evaluating this model. |
CyberHarem/beres_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of beres (Fire Emblem)
This is the dataset of beres (Fire Emblem), containing 500 images and their tags.
The core tags of this character are `blue_eyes, blue_hair, breasts, bangs, long_hair, large_breasts, hair_between_eyes, medium_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 699.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/beres_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 400.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/beres_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1230 | 839.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/beres_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 620.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/beres_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1230 | 1.16 GiB | [Download](https://huggingface.co/datasets/CyberHarem/beres_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/beres_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, christmas, santa_hat, solo, santa_costume, pantyhose, simple_background, smile, white_gloves, fur_trim, blush, character_doll, closed_mouth, navel, shorts, white_background |
| 1 | 25 |  |  |  |  |  | 1girl, solo, upper_body, closed_mouth, simple_background, looking_at_viewer, cape, white_background, shoulder_armor, tassel, turtleneck |
| 2 | 5 |  |  |  |  |  | 1girl, armlet, bodice, bustier, capelet, closed_mouth, dagger, navel_cutout, sheathed, shoulder_armor, solo, tassel, turtleneck, upper_body, vambraces, midriff, short_sleeves, simple_background, black_armor, black_shorts, looking_at_viewer, sword, white_background |
| 3 | 22 |  |  |  |  |  | 1girl, armlet, black_shorts, brown_pantyhose, bustier, capelet, floral_print, legwear_under_shorts, pantyhose_under_shorts, print_legwear, short_shorts, shoulder_armor, tassel, turtleneck, bodice, dagger, looking_at_viewer, midriff, navel_cutout, patterned_clothing, short_sleeves, solo, vambraces, coat, waist_cape, lace-trimmed_legwear, simple_background, cloak, white_background, sheathed, closed_mouth, standing, teacher, black_armor, cowboy_shot, holding_sword, smile, sidelocks |
| 4 | 7 |  |  |  |  |  | 1girl, cape, solo, upper_body, armor, holding_sword, looking_at_viewer, closed_mouth, tassel |
| 5 | 5 |  |  |  |  |  | 1girl, black_skirt, closed_mouth, long_sleeves, simple_background, white_background, black_gloves, garreg_mach_monastery_uniform, looking_at_viewer, solo, blush, pink_hairband, black_cape, hand_on_hip, holding |
| 6 | 9 |  |  |  |  |  | midriff, 1girl, black_shorts, navel, short_shorts, solo, crop_top, legwear_under_shorts, looking_at_viewer, pantyhose_under_shorts, short_sleeves, blush, simple_background, cleavage_cutout, shirt, closed_mouth, smile |
| 7 | 5 |  |  |  |  |  | 1girl, alternate_costume, armlet, dancer, looking_at_viewer, simple_background, solo, cleavage, dress, jewelry, smile, closed_mouth, thighs, white_background, bare_shoulders, blush, cowboy_shot, grey_background, shawl |
| 8 | 7 |  |  |  |  |  | 1girl, black_bikini, cleavage, dagger, hair_flower, looking_at_viewer, official_alternate_costume, sheathed, solo, navel, cape, hibiscus, closed_mouth, cowboy_shot, simple_background, smile, white_background |
| 9 | 5 |  |  |  |  |  | black_bikini, blue_sky, cleavage, cloud, cowboy_shot, dagger, day, hair_flower, looking_at_viewer, navel, official_alternate_costume, sheathed, 1girl, outdoors, parted_lips, solo, tassel, black_capelet, hibiscus, ocean, smile, artist_name, beach, groin, open_mouth, red_flower, sitting, water |
| 10 | 6 |  |  |  |  |  | 1girl, black_bikini, blue_sky, cleavage, hair_flower, looking_at_viewer, navel, official_alternate_costume, outdoors, smile, solo, tassel, closed_mouth, cloud, day, ocean, hibiscus, red_flower, beach, upper_body, water |
| 11 | 13 |  |  |  |  |  | 1girl, completely_nude, looking_at_viewer, ocean, outdoors, solo, beach, blue_sky, day, nipples, smile, blush, collarbone, navel, pussy, water, mosaic_censoring, ass_visible_through_thighs, green_hair, cloud, wet, closed_mouth |
| 12 | 7 |  |  |  |  |  | 1girl, armpits, arms_up, black_hair, blush, cleavage, glasses, looking_at_viewer, smile, solo, collarbone, huge_breasts, skindentation, black-framed_eyewear, upper_body, arms_behind_head, closed_mouth, navel, black_bikini, black_bra, choker, green_hair, indoors, semi-rimless_eyewear |
| 13 | 8 |  |  |  |  |  | 1girl, enmaided, maid_headdress, solo, simple_background, maid_apron, white_background, blush, looking_at_viewer, black_dress, closed_mouth, frills, glasses, holding, juliet_sleeves, red-framed_eyewear |
| 14 | 12 |  |  |  |  |  | fake_animal_ears, playboy_bunny, rabbit_ears, cleavage, detached_collar, looking_at_viewer, 1girl, simple_background, solo, alternate_costume, black_leotard, blush, wrist_cuffs, bare_shoulders, pantyhose, smile, closed_mouth, strapless_leotard, sitting, white_background |
| 15 | 5 |  |  |  |  |  | 1girl, bell, christmas, fake_animal_ears, fake_antlers, looking_at_viewer, solo, blush, fishnet_thighhighs, garter_straps, red_panties, alternate_costume, ass, collarbone, reindeer_antlers, simple_background, white_background, bare_shoulders, choker, cleavage, crossed_arms, from_behind, fur_trim, lingerie, looking_back, navel, smile, topless |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | christmas | santa_hat | solo | santa_costume | pantyhose | simple_background | smile | white_gloves | fur_trim | blush | character_doll | closed_mouth | navel | shorts | white_background | upper_body | looking_at_viewer | cape | shoulder_armor | tassel | turtleneck | armlet | bodice | bustier | capelet | dagger | navel_cutout | sheathed | vambraces | midriff | short_sleeves | black_armor | black_shorts | sword | brown_pantyhose | floral_print | legwear_under_shorts | pantyhose_under_shorts | print_legwear | short_shorts | patterned_clothing | coat | waist_cape | lace-trimmed_legwear | cloak | standing | teacher | cowboy_shot | holding_sword | sidelocks | armor | black_skirt | long_sleeves | black_gloves | garreg_mach_monastery_uniform | pink_hairband | black_cape | hand_on_hip | holding | crop_top | cleavage_cutout | shirt | alternate_costume | dancer | cleavage | dress | jewelry | thighs | bare_shoulders | grey_background | shawl | black_bikini | hair_flower | official_alternate_costume | hibiscus | blue_sky | cloud | day | outdoors | parted_lips | black_capelet | ocean | artist_name | beach | groin | open_mouth | red_flower | sitting | water | completely_nude | nipples | collarbone | pussy | mosaic_censoring | ass_visible_through_thighs | green_hair | wet | armpits | arms_up | black_hair | glasses | huge_breasts | skindentation | black-framed_eyewear | arms_behind_head | black_bra | choker | indoors | semi-rimless_eyewear | enmaided | maid_headdress | maid_apron | black_dress | frills | juliet_sleeves | red-framed_eyewear | fake_animal_ears | playboy_bunny | rabbit_ears | detached_collar | black_leotard | wrist_cuffs | strapless_leotard | bell | fake_antlers | fishnet_thighhighs | garter_straps | red_panties | ass | reindeer_antlers | crossed_arms | from_behind | lingerie | looking_back | topless |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:------------|:------------|:-------|:----------------|:------------|:--------------------|:--------|:---------------|:-----------|:--------|:-----------------|:---------------|:--------|:---------|:-------------------|:-------------|:--------------------|:-------|:-----------------|:---------|:-------------|:---------|:---------|:----------|:----------|:---------|:---------------|:-----------|:------------|:----------|:----------------|:--------------|:---------------|:--------|:------------------|:---------------|:-----------------------|:-------------------------|:----------------|:---------------|:---------------------|:-------|:-------------|:-----------------------|:--------|:-----------|:----------|:--------------|:----------------|:------------|:--------|:--------------|:---------------|:---------------|:--------------------------------|:----------------|:-------------|:--------------|:----------|:-----------|:------------------|:--------|:--------------------|:---------|:-----------|:--------|:----------|:---------|:-----------------|:------------------|:--------|:---------------|:--------------|:-----------------------------|:-----------|:-----------|:--------|:------|:-----------|:--------------|:----------------|:--------|:--------------|:--------|:--------|:-------------|:-------------|:----------|:--------|:------------------|:----------|:-------------|:--------|:-------------------|:-----------------------------|:-------------|:------|:----------|:----------|:-------------|:----------|:---------------|:----------------|:-----------------------|:-------------------|:------------|:---------|:----------|:-----------------------|:-----------|:-----------------|:-------------|:--------------|:---------|:-----------------|:---------------------|:-------------------|:----------------|:--------------|:------------------|:----------------|:--------------|:--------------------|:-------|:---------------|:---------------------|:----------------|:--------------|:------|:-------------------|:---------------|:--------------|:-----------|:---------------|:----------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 25 |  |  |  |  |  | X | | | X | | | X | | | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | | X | | | X | | | | | | X | | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 22 |  |  |  |  |  | X | | | X | | | X | X | | | | | X | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | | X | | | | | | | | | X | | | | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | X | | | X | | | | X | | X | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | | | X | | | X | X | | | X | | X | X | | | | X | | | | | | | | | | | | | X | X | | X | | | | X | X | | X | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | X | | | X | X | | | X | | X | | | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | | | X | | | X | X | | | | | X | X | | X | | X | X | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | | X | | | | X | | | | | | X | | | | X | | | X | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 6 |  |  |  |  |  | X | | | X | | | | X | | | | | X | X | | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | | | X | | X | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 13 |  |  |  |  |  | X | | | X | | | | X | | | X | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | X | | X | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 12 | 7 |  |  |  |  |  | X | | | X | | | | X | | | X | | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 13 | 8 |  |  |  |  |  | X | | | X | | | X | | | | X | | X | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 14 | 12 |  |  |  |  |  | X | | | X | | X | X | X | | | X | | X | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 15 | 5 |  |  |  |  |  | X | X | | X | | | X | X | | X | X | | | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
irds/neumarco_fa | ---
pretty_name: '`neumarco/fa`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `neumarco/fa`
The `neumarco/fa` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/neumarco#neumarco/fa).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=8,841,823
This dataset is used by: [`neumarco_fa_dev`](https://huggingface.co/datasets/irds/neumarco_fa_dev), [`neumarco_fa_dev_judged`](https://huggingface.co/datasets/irds/neumarco_fa_dev_judged), [`neumarco_fa_dev_small`](https://huggingface.co/datasets/irds/neumarco_fa_dev_small), [`neumarco_fa_train`](https://huggingface.co/datasets/irds/neumarco_fa_train), [`neumarco_fa_train_judged`](https://huggingface.co/datasets/irds/neumarco_fa_train_judged)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/neumarco_fa', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
|
bigscience-data/roots_indic-mr_indic_nlp_corpus | ---
language: mr
license: cc-by-nc-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-mr_indic_nlp_corpus
# Indic NLP Corpus
- Dataset uid: `indic_nlp_corpus`
### Description
The IndicNLP corpus is a large-scale, general-domain corpus containing 2.7 billion words for 10 Indian languages from two language families (Indo-Aryan and Dravidian). Each language has at least 100 million words (except Oriya).
### Homepage
https://github.com/AI4Bharat/indicnlp_corpus#publicly-available-classification-datasets
### Licensing
- non-commercial use
- cc-by-nc-sa-4.0: Creative Commons Attribution Non Commercial Share Alike 4.0 International
### Speaker Locations
- Southern Asia
- India
### Sizes
- 3.4019 % of total
- 44.4368 % of indic-hi
- 64.2943 % of indic-ta
- 70.5374 % of indic-ml
- 54.2394 % of indic-te
- 55.9105 % of indic-kn
- 61.6111 % of indic-mr
- 67.2242 % of indic-pa
- 68.1470 % of indic-or
- 64.3879 % of indic-gu
- 4.1495 % of indic-bn
### BigScience processing steps
#### Filters applied to: indic-hi
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-pa
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-or
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
#### Filters applied to: indic-gu
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
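As an illustrative sketch only (not the actual BigScience pipeline code), the document-level deduplication listed in the filters above can be approximated by hashing normalized document text and keeping the first occurrence:

```python
import hashlib

def dedup_documents(docs):
    """Keep the first occurrence of each exact-duplicate document."""
    seen = set()
    unique = []
    for text in docs:
        # Hash the whitespace-normalized text so trailing-space variants
        # of the same document collapse to one key.
        key = hashlib.sha256(text.strip().encode("utf-8")).hexdigest()
        if key not in seen:
            seen.add(key)
            unique.append(text)
    return unique

docs = ["doc a", "doc b", "doc a ", ""]
print(dedup_documents(docs))  # empty docs would be dropped by a later filter
```

Steps like `filter_remove_empty_docs` and `filter_small_docs_bytes_300` would then drop documents below a minimum byte size.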
|
open-llm-leaderboard/details_Brillibits__Instruct_Llama70B_Dolly15k | ---
pretty_name: Evaluation run of Brillibits/Instruct_Llama70B_Dolly15k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Brillibits/Instruct_Llama70B_Dolly15k](https://huggingface.co/Brillibits/Instruct_Llama70B_Dolly15k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Brillibits__Instruct_Llama70B_Dolly15k_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-07T07:12:49.365073](https://huggingface.co/datasets/open-llm-leaderboard/details_Brillibits__Instruct_Llama70B_Dolly15k_public/blob/main/results_2023-11-07T07-12-49.365073.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2294463087248322,\n\
\ \"em_stderr\": 0.004306075513502917,\n \"f1\": 0.2826310822147651,\n\
\ \"f1_stderr\": 0.004256290262260348,\n \"acc\": 0.6348872917405918,\n\
\ \"acc_stderr\": 0.01192527682309685\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2294463087248322,\n \"em_stderr\": 0.004306075513502917,\n\
\ \"f1\": 0.2826310822147651,\n \"f1_stderr\": 0.004256290262260348\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4268385140257771,\n \
\ \"acc_stderr\": 0.013624249696595222\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8429360694554064,\n \"acc_stderr\": 0.010226303949598477\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Brillibits/Instruct_Llama70B_Dolly15k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_11_07T07_12_49.365073
path:
- '**/details_harness|drop|3_2023-11-07T07-12-49.365073.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-07T07-12-49.365073.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_07T07_12_49.365073
path:
- '**/details_harness|gsm8k|5_2023-11-07T07-12-49.365073.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-07T07-12-49.365073.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_07T07_12_49.365073
path:
- '**/details_harness|winogrande|5_2023-11-07T07-12-49.365073.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-07T07-12-49.365073.parquet'
- config_name: results
data_files:
- split: 2023_11_07T07_12_49.365073
path:
- results_2023-11-07T07-12-49.365073.parquet
- split: latest
path:
- results_2023-11-07T07-12-49.365073.parquet
---
# Dataset Card for Evaluation run of Brillibits/Instruct_Llama70B_Dolly15k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Brillibits/Instruct_Llama70B_Dolly15k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Brillibits/Instruct_Llama70B_Dolly15k](https://huggingface.co/Brillibits/Instruct_Llama70B_Dolly15k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Brillibits__Instruct_Llama70B_Dolly15k_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-07T07:12:49.365073](https://huggingface.co/datasets/open-llm-leaderboard/details_Brillibits__Instruct_Llama70B_Dolly15k_public/blob/main/results_2023-11-07T07-12-49.365073.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2294463087248322,
"em_stderr": 0.004306075513502917,
"f1": 0.2826310822147651,
"f1_stderr": 0.004256290262260348,
"acc": 0.6348872917405918,
"acc_stderr": 0.01192527682309685
},
"harness|drop|3": {
"em": 0.2294463087248322,
"em_stderr": 0.004306075513502917,
"f1": 0.2826310822147651,
"f1_stderr": 0.004256290262260348
},
"harness|gsm8k|5": {
"acc": 0.4268385140257771,
"acc_stderr": 0.013624249696595222
},
"harness|winogrande|5": {
"acc": 0.8429360694554064,
"acc_stderr": 0.010226303949598477
}
}
```
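As a quick sanity check on the numbers above (values copied from the run shown), the top-level `acc` in the `"all"` block is the unweighted mean of the two per-task accuracies:

```python
# Values copied from the "Latest results" block above.
results = {
    "all": {"acc": 0.6348872917405918},
    "harness|gsm8k|5": {"acc": 0.4268385140257771},
    "harness|winogrande|5": {"acc": 0.8429360694554064},
}

# The aggregated "acc" is the unweighted mean of the per-task accuracies.
task_accs = [results[k]["acc"] for k in ("harness|gsm8k|5", "harness|winogrande|5")]
mean_acc = sum(task_accs) / len(task_accs)
print(mean_acc)  # matches results["all"]["acc"] up to float rounding
```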
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jlbaker361/anime_faces_mini | ---
dataset_info:
features:
- name: image
dtype: image
- name: split
dtype: string
- name: src
dtype: string
- name: style
dtype: string
splits:
- name: train
num_bytes: 612306.0
num_examples: 10
download_size: 612207
dataset_size: 612306.0
---
# Dataset Card for "anime_faces_mini"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713011850 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 13694
num_examples: 30
download_size: 11049
dataset_size: 13694
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713011850"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
copenlu/wiki-stance | ---
license: cc-by-sa-4.0
---
|
the_pile_books3 | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- mit
multilinguality:
- monolingual
pretty_name: Books3
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
viewer: false
dataset_info:
features:
- name: title
dtype: string
- name: text
dtype: string
config_name: plain_text
splits:
- name: train
num_bytes: 108392037000
num_examples: 196639
download_size: 39516981435
dataset_size: 108392037000
---
# Dataset Card for the_pile_books3
## Table of Contents
- [Dataset Card for the_pile_books3](#dataset-card-for-the_pile_books3)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [|split|num examples|](#splitnum-examples)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [GitHub](https://github.com/soskek/bookcorpus/issues/27#issuecomment-716104208)
- **Repository:** [Needs More Information]
- **Paper:** [arXiv](https://arxiv.org/abs/2101.00027)
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Needs More Information]
### Dataset Summary
<div class="course-tip course-tip-orange bg-gradient-to-br dark:bg-gradient-to-r before:border-orange-500 dark:before:border-orange-800 from-orange-50 dark:from-gray-900 to-white dark:to-gray-950 border border-orange-50 text-orange-700 dark:text-gray-400">
<p><b>Defunct:</b> Dataset "the_pile_books3" is defunct and no longer accessible due to reported copyright infringement.</p>
</div>
This dataset is Shawn Presser's work and is part of EleutherAI's The Pile dataset.
This dataset contains all of Bibliotik in plain .txt form, i.e. 197,000 books processed in exactly the same way as was done for bookcorpusopen (a.k.a. books1). It appears to be similar to OpenAI's mysterious "books2" dataset referenced in their papers. Unfortunately, OpenAI will not give details, so we know very little about any differences. People suspect it is "all of libgen", but that is purely conjecture.
| | |
|---|---|
|download_size|36.8 GiB|
|dataset_size|100.9 GiB|
### Supported Tasks and Leaderboards
This dataset is used for Language Modeling.
### Languages
The dataset is in English.
## Dataset Structure
### Data Instances
```
{'title': '07 LEGO Ninjago - The Search For Zane (Scholastic) - Kate Howard (retail)',
'text': '\n\nTITLE PAGE\n\nFROM THE JOURNAL OF SENSEI GARMADON\n\nCHAPTER 1\n\nCHAPTER 2\n\nCHAPTER 3\n\nCHAPTER 4\n\nCHAPTER 5\n\nCHAPTER 6\n\nCHAPTER 7\n\nCHAPTER 8\n\nCHAPTER 9\n\nCOPYRIGHT\n\nThroughout Ninjago", five ninja are well-known for their speed, strength, and of course the elemental powers that help them protect our world from evil. But there are others who possess some of the same powers as the ninja. Others who may not always use their powers for good.\n\nBefore now, the ninja believed they were special. They di.......'}
```
### Data Fields
- `title`: title of the book
- `text`: text content of the book
### Data Splits
|split|num examples|
|-----|------------|
|train|196640|
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
MIT
### Citation Information
```
@article{pile,
title={The {P}ile: An 800GB Dataset of Diverse Text for Language Modeling},
author={Gao, Leo and Biderman, Stella and Black, Sid and Golding, Laurence and Hoppe, Travis and Foster, Charles and Phang, Jason and He, Horace and Thite, Anish and Nabeshima, Noa and Presser, Shawn and Leahy, Connor},
journal={arXiv preprint arXiv:2101.00027},
year={2020}
}
```
### Contributions
Thanks to [@shawwn](https://github.com/shawwn) for creating this dataset.
Thanks to [@richarddwang](https://github.com/richarddwang) for adding this dataset. |
casecrit/2024-indonesian-election | ---
license: cc-by-nc-2.0
language:
- id
size_categories:
- 10M<n<100M
---
The dataset encompasses news articles spanning from November 29, 2023, to February 6, 2024, capturing the discourse surrounding the five presidential debates orchestrated by the General Elections Commission. Sourced from reputable platforms such as detik, kompas, and liputan6, the dataset offers a comprehensive insight into the electoral landscape and the media coverage thereof. |
BangumiBase/nichijou | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Nichijou
This is the image base of the bangumi Nichijou. We detected 33 characters and 2,652 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 346 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 16 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 51 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 449 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 105 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 10 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 75 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 91 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 73 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 16 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 479 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 33 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 72 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 75 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 79 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 19 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 17 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 80 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 30 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 181 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 16 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 15 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 36 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 100 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 13 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 33 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 14 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 7 | [Download](27/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 28 | 9 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 14 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 12 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 22 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 64 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
AdapterOcean/gorilla_16k_standardized_cluster_4_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 5005609
num_examples: 8256
download_size: 1950794
dataset_size: 5005609
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gorilla_16k_standardized_cluster_4_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
victorzarzu/interior-design-prompt-editing-dataset-test | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: input_prompt
dtype: string
- name: edit_prompt
dtype: string
- name: output_prompt
dtype: string
- name: designed_image
dtype: image
splits:
- name: train
num_bytes: 389110990.624
num_examples: 1129
download_size: 386809444
dataset_size: 389110990.624
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
carlos19850/Cartman_images | ---
license: mit
---
|
om-ashish-soni/shiv-mahapurana-text | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 299549
num_examples: 842
download_size: 163413
dataset_size: 299549
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Arnaldo34/Myvoice.mp3 | ---
license: openrail
---
|
positivethoughts/alpaca_merge_rewrite_13.3k | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 53707647
num_examples: 13365
download_size: 31667125
dataset_size: 53707647
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
danjacobellis/MPQv1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: quality
dtype: float32
- name: correct
dtype: float64
splits:
- name: train
num_bytes: 123720.0
num_examples: 10310
- name: test
num_bytes: 30936.0
num_examples: 2578
download_size: 79295
dataset_size: 154656.0
---
# Dataset Card for "MPQv1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JotDe/birds | ---
dataset_info:
features:
- name: image
dtype: image
- name: description
dtype: string
- name: label
dtype:
class_label:
names:
'0': Black footed Albatross
'1': Laysan Albatross
'2': Sooty Albatross
'3': Groove billed Ani
'4': Crested Auklet
'5': Least Auklet
'6': Parakeet Auklet
'7': Rhinoceros Auklet
'8': Brewer Blackbird
'9': Red winged Blackbird
'10': Rusty Blackbird
'11': Yellow headed Blackbird
'12': Bobolink
'13': Indigo Bunting
'14': Lazuli Bunting
'15': Painted Bunting
'16': Cardinal
'17': Spotted Catbird
'18': Gray Catbird
'19': Yellow breasted Chat
'20': Eastern Towhee
'21': Chuck will Widow
'22': Brandt Cormorant
'23': Red faced Cormorant
'24': Pelagic Cormorant
'25': Bronzed Cowbird
'26': Shiny Cowbird
'27': Brown Creeper
'28': American Crow
'29': Fish Crow
'30': Black billed Cuckoo
'31': Mangrove Cuckoo
'32': Yellow billed Cuckoo
'33': Gray crowned Rosy Finch
'34': Purple Finch
'35': Northern Flicker
'36': Acadian Flycatcher
'37': Great Crested Flycatcher
'38': Least Flycatcher
'39': Olive sided Flycatcher
'40': Scissor tailed Flycatcher
'41': Vermilion Flycatcher
'42': Yellow bellied Flycatcher
'43': Frigatebird
'44': Northern Fulmar
'45': Gadwall
'46': American Goldfinch
'47': European Goldfinch
'48': Boat tailed Grackle
'49': Eared Grebe
'50': Horned Grebe
'51': Pied billed Grebe
'52': Western Grebe
'53': Blue Grosbeak
'54': Evening Grosbeak
'55': Pine Grosbeak
'56': Rose breasted Grosbeak
'57': Pigeon Guillemot
'58': California Gull
'59': Glaucous winged Gull
'60': Heermann Gull
'61': Herring Gull
'62': Ivory Gull
'63': Ring billed Gull
'64': Slaty backed Gull
'65': Western Gull
'66': Anna Hummingbird
'67': Ruby throated Hummingbird
'68': Rufous Hummingbird
'69': Green Violetear
'70': Long tailed Jaeger
'71': Pomarine Jaeger
'72': Blue Jay
'73': Florida Jay
'74': Green Jay
'75': Dark eyed Junco
'76': Tropical Kingbird
'77': Gray Kingbird
'78': Belted Kingfisher
'79': Green Kingfisher
'80': Pied Kingfisher
'81': Ringed Kingfisher
'82': White breasted Kingfisher
'83': Red legged Kittiwake
'84': Horned Lark
'85': Pacific Loon
'86': Mallard
'87': Western Meadowlark
'88': Hooded Merganser
'89': Red breasted Merganser
'90': Mockingbird
'91': Nighthawk
'92': Clark Nutcracker
'93': White breasted Nuthatch
'94': Baltimore Oriole
'95': Hooded Oriole
'96': Orchard Oriole
'97': Scott Oriole
'98': Ovenbird
'99': Brown Pelican
'100': White Pelican
'101': Western Wood Pewee
'102': Sayornis
'103': American Pipit
'104': Whip poor Will
'105': Horned Puffin
'106': Common Raven
'107': White necked Raven
'108': American Redstart
'109': Geococcyx
'110': Loggerhead Shrike
'111': Great Grey Shrike
'112': Baird Sparrow
'113': Black throated Sparrow
'114': Brewer Sparrow
'115': Chipping Sparrow
'116': Clay colored Sparrow
'117': House Sparrow
'118': Field Sparrow
'119': Fox Sparrow
'120': Grasshopper Sparrow
'121': Harris Sparrow
'122': Henslow Sparrow
'123': Le Conte Sparrow
'124': Lincoln Sparrow
'125': Nelson Sharp tailed Sparrow
'126': Savannah Sparrow
'127': Seaside Sparrow
'128': Song Sparrow
'129': Tree Sparrow
'130': Vesper Sparrow
'131': White crowned Sparrow
'132': White throated Sparrow
'133': Cape Glossy Starling
'134': Bank Swallow
'135': Barn Swallow
'136': Cliff Swallow
'137': Tree Swallow
'138': Scarlet Tanager
'139': Summer Tanager
'140': Artic Tern
'141': Black Tern
'142': Caspian Tern
'143': Common Tern
'144': Elegant Tern
'145': Forsters Tern
'146': Least Tern
'147': Green tailed Towhee
'148': Brown Thrasher
'149': Sage Thrasher
'150': Black capped Vireo
'151': Blue headed Vireo
'152': Philadelphia Vireo
'153': Red eyed Vireo
'154': Warbling Vireo
'155': White eyed Vireo
'156': Yellow throated Vireo
'157': Bay breasted Warbler
'158': Black and white Warbler
'159': Black throated Blue Warbler
'160': Blue winged Warbler
'161': Canada Warbler
'162': Cape May Warbler
'163': Cerulean Warbler
'164': Chestnut sided Warbler
'165': Golden winged Warbler
'166': Hooded Warbler
'167': Kentucky Warbler
'168': Magnolia Warbler
'169': Mourning Warbler
'170': Myrtle Warbler
'171': Nashville Warbler
'172': Orange crowned Warbler
'173': Palm Warbler
'174': Pine Warbler
'175': Prairie Warbler
'176': Prothonotary Warbler
'177': Swainson Warbler
'178': Tennessee Warbler
'179': Wilson Warbler
'180': Worm eating Warbler
'181': Yellow Warbler
'182': Northern Waterthrush
'183': Louisiana Waterthrush
'184': Bohemian Waxwing
'185': Cedar Waxwing
'186': American Three toed Woodpecker
'187': Pileated Woodpecker
'188': Red bellied Woodpecker
'189': Red cockaded Woodpecker
'190': Red headed Woodpecker
'191': Downy Woodpecker
'192': Bewick Wren
'193': Cactus Wren
'194': Carolina Wren
'195': House Wren
'196': Marsh Wren
'197': Rock Wren
'198': Winter Wren
'199': Common Yellowthroat
- name: file_name
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 583864786.046
num_examples: 5994
- name: test
num_bytes: 577090882.934
num_examples: 5794
download_size: 1148682949
dataset_size: 1160955668.98
---
# Dataset Card for "birds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibm-nasa-geospatial/multi-temporal-crop-classification | ---
license: cc-by-4.0
language:
- en
tags:
- remote sensing
- segmentation
- crop type
- foundation model
size_categories:
- 1K<n<10K
---
# Dataset Card for Multi-Temporal Crop Classification
## Dataset Description
- **Homepage: https://huggingface.co/datasets/ibm-nasa-geospatial/cdl-crops/**
- **Point of Contact: Dr. Hamed Alemohammad (halemohammad@clarku.edu)**
### Dataset Summary
This dataset contains temporal Harmonized Landsat-Sentinel (HLS) imagery of diverse land cover and crop type classes across the Contiguous United States for the year 2022. The target labels are derived from USDA's Crop Data Layer (CDL). Its primary purpose is training segmentation geospatial machine learning models.
### Dataset Structure
## TIFF Files
Each GeoTIFF file covers a 224 x 224 pixel area at 30 m spatial resolution. Each input satellite file contains 18 bands: 6 spectral bands for each of three time steps, stacked together. Each mask GeoTIFF contains one band with the target class for each pixel.
## Band Order
In each input GeoTIFF the following bands are repeated three times for three observations throughout the growing season:
| Channel | Name | HLS S30 Band number |
|--------:|------|---------------------|
| 1 | Blue | B02 |
| 2 | Green | B03 |
| 3 | Red | B04 |
| 4 | NIR | B8A |
| 5 | SW 1 | B11 |
| 6 | SW 2 | B12 |
Masks are a single band with the following values:

| Value | Class |
|------:|-------|
| 0 | No Data |
| 1 | Natural Vegetation |
| 2 | Forest |
| 3 | Corn |
| 4 | Soybeans |
| 5 | Wetlands |
| 6 | Developed/Barren |
| 7 | Open Water |
| 8 | Winter Wheat |
| 9 | Alfalfa |
| 10 | Fallow/Idle Cropland |
| 11 | Cotton |
| 12 | Sorghum |
| 13 | Other |
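Since the three time steps are simply stacked along the band axis, a chip read as an 18-band array can be split back into per-time-step cubes with a single reshape. A minimal NumPy sketch on a dummy array (reading the actual GeoTIFF, e.g. with rasterio, is assumed to yield a `(bands, height, width)` array in the band order above):

```python
import numpy as np

# Dummy chip standing in for one 18-band, 224 x 224 input GeoTIFF:
# bands 1-6 are time step 1, bands 7-12 time step 2, bands 13-18 time step 3.
chip = np.arange(18 * 224 * 224, dtype=np.float32).reshape(18, 224, 224)

# Split the stacked bands into (time, band, height, width).
temporal = chip.reshape(3, 6, 224, 224)

band_names = ["Blue", "Green", "Red", "NIR", "SW 1", "SW 2"]
nir_mid_season = temporal[1, band_names.index("NIR")]  # NIR of the 2nd time step
print(temporal.shape, nir_mid_season.shape)  # (3, 6, 224, 224) (224, 224)
```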
## Class Distribution
### Training Data Distribution

### Validation Data Distribution

## Data Splits
The 3,854 chips have been randomly split into training (80%) and validation (20%), with the corresponding chip IDs recorded in the text files `train_data.txt` and `validation_data.txt`.
## Dataset Creation
### Query and Scene Selection
First, a set of 5,000 chips was defined based on samples from the USDA CDL to ensure representative sampling across the CONUS. Next, for each chip, the corresponding HLS S30 scenes between March and September 2022 were queried, and scenes with low cloud cover were retrieved. Then, three scenes were selected from among the low-cloud scenes to ensure one scene from early in the season, one from the middle, and one from toward the end. The three final scenes were then reprojected to CDL's projection grid (`EPSG:5070`) using bilinear interpolation.
### Chip Generation
In the final step, the three scenes for each chip were clipped to the chip's bounding box, and the 18 spectral bands were stacked together. In addition, quality control was applied to each chip using the `Fmask` layer of the HLS dataset: any chip containing clouds, cloud shadow, cloud-adjacent pixels, or missing values was discarded. This resulted in 3,854 chips.
### Dataset Download
You can download the data in `.tgz` format from this repository (you need to install [Git Large File Storage](https://git-lfs.com/) for this). The same version of the data is hosted on [Source Cooperative](https://beta.source.coop/repositories/clarkcga/multi-temporal-crop-classification/description) as objects on AWS S3.
### Citation
If this dataset helped your research, please cite `hls-multi-temporal-crop-classification` in your publications. Here is an example BibTeX entry:
```
@misc{hls-multi-temporal-crop-classification,
author = {Cecil, Michael and Kordi, Fatemeh and Li, Hanxi (Steve) and Khallaghi, Sam and Alemohammad, Hamed},
doi = {10.57967/hf/0955},
month = aug,
title = {{HLS Multi Temporal Crop Classification}},
url = {https://huggingface.co/ibm-nasa-geospatial/multi-temporal-crop-classification},
year = {2023}
}
``` |
Isaak-Carter/Function_Calling_Private_GG | ---
license: apache-2.0
task_categories:
- text-generation
- conversational
pretty_name: Function Calling Like A Champ
size_categories:
- 100K<n<1M
---
# Function Recommendation Dataset Readme
## Description
This dataset is based on the "glaiveai/glaive-function-calling" repository and has been customized to suit my specific requirements. It is designed for fine-tuning a Large Language Model (LLM) on the task of generating function recommendations in a conversational context. The dataset contains 158,738 snippets of conversations between me and my virtual assistant named J.O.S.I.E. (Just an Outstandingly Smart Intelligent Entity), who exclusively assists me with various tasks by recommending functions and providing relevant information.
Each snippet in the dataset represents a conversation between the user and J.O.S.I.E. The conversations typically follow this structure:
1. The conversation begins with an introduction of J.O.S.I.E. and its capabilities.
2. The user asks a question or requests assistance related to a specific task.
3. J.O.S.I.E. responds by recommending a function call that can help fulfill the user's request.
4. The user may ask follow-up questions or request additional recommendations.
5. J.O.S.I.E. continues to provide function recommendations and responses as needed.
## Snippet Example
Here is an example snippet from the dataset:
```markdown
### SYSTEM: You are Gökdeniz Gülmez's private assistant named J.O.S.I.E. (Just an Outstandingly Smart Intelligent Entity) who has access to the following functions to exclusively help Gökdeniz Gülmez, you can use the functions if needed-
{
"name": "recommend_movies",
"description": "Recommend movies based on user preferences",
"parameters": {
"type": "object",
"properties": {
"genre": {
"type": "string",
"description": "The preferred movie genre"
},
"year_range": {
"type": "object",
"properties": {
"start_year": {
"type": "integer",
"description": "The start year of the movie release"
},
"end_year": {
"type": "integer",
"description": "The end year of the movie release"
}
}
}
}
}
}
### Gökdeniz Gülmez: Can you recommend some movies for me?
### J.O.S.I.E.: Sure! I can recommend movies based on your preferences. Please provide me with your preferred movie genre and the year range of the movie release.
### Gökdeniz Gülmez: I enjoy action movies from the 2000s.
### J.O.S.I.E.: <functioncall> {"name": "recommend_movies", "arguments": '{
"genre": "action",
"year_range": {
"start_year": 2000,
"end_year": 2009
}
}'}
### FUNCTION RESPONSE: {"movies": ["The Dark Knight", "Gladiator", "The Bourne Identity", "Kill Bill: Volume 1", "The Matrix"]}
### J.O.S.I.E.: Based on your preferences, I recommend the following movies: "The Dark Knight", "Gladiator", "The Bourne Identity", "Kill Bill: Volume 1", and "The Matrix".
```
In this example, the user asks J.O.S.I.E. to recommend action movies from the 2000s. J.O.S.I.E. responds with a function call to the "recommend_movies" function and provides a list of recommended movies as a response.
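As a rough illustration (not part of the dataset tooling), a snippet in this `### Speaker: content` format could be split into role/content turns with a few lines of Python. The parsing rules here are assumptions based on the example above: every turn starts on a new line with `### `, and the speaker name ends at the first `: `.

```python
import re

# Hypothetical snippet in the card's "### Speaker: content" format.
snippet = """### SYSTEM: You are J.O.S.I.E., a private assistant with access to functions.
### Gökdeniz Gülmez: Can you recommend some movies for me?
### J.O.S.I.E.: Sure! Please tell me your preferred genre and year range."""

turns = []
# Split only at newlines that begin a new "### " marker, so multi-line
# contents (e.g. function-call JSON) stay attached to their turn.
for block in re.split(r"\n(?=### )", snippet):
    speaker, _, content = block.removeprefix("### ").partition(": ")
    turns.append({"role": speaker, "content": content.strip()})

print(turns[1])  # {'role': 'Gökdeniz Gülmez', 'content': 'Can you recommend some movies for me?'}
```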
## Dataset Usage
This dataset can be used for training and fine-tuning Large Language Models (LLMs) such as GPT-3.5 on the task of generating function recommendations in a conversational context. Researchers and developers can use this data to build virtual assistants or chatbots capable of recommending functions and providing relevant information to users based on their requests.
## Citation
If you use this dataset in your research or applications, please cite it as follows:
```
@dataset{your citation here,
title = {Private Function Calling},
author = {Gökdeniz Gülmez},
year = {2023},
publisher = {Gökdeniz Gülmez},
url = {https://huggingface.co/datasets/Isaak-Carter/Function_Calling_Private_GG/tree/main},
}
``` |
Sangjeong/TestData | ---
license: apache-2.0
---
|
polejowska/cd45rb | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
list:
- name: category_id
dtype:
class_label:
names:
'0': leukocyte
- name: image_id
dtype: string
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: segmentation
list:
list: float32
- name: iscrowd
dtype: bool
splits:
- name: train
num_bytes: 35879463408.88
num_examples: 18421
- name: valid
num_bytes: 3475442128.938
num_examples: 1781
- name: test
num_bytes: 4074586864.944
num_examples: 2116
download_size: 43275144782
dataset_size: 43429492402.762
---
# Dataset Card for "cd45rb"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ruffusplay/ajolote2 | ---
license: openrail
---
|
davanstrien/haiku-kto | ---
dataset_info:
features:
- name: prompt
dtype: string
id: field
- name: completion
dtype: string
id: field
- name: label
dtype: bool
- name: label-suggestion
dtype: string
id: suggestion
- name: label-suggestion-metadata
struct:
- name: type
dtype: string
id: suggestion-metadata
- name: score
dtype: float32
id: suggestion-metadata
- name: agent
dtype: string
id: suggestion-metadata
- name: external_id
dtype: string
id: external_id
- name: metadata
dtype: string
id: metadata
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 5503
num_examples: 11
download_size: 10862
dataset_size: 5503
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
weqweasdas/ultra_prompt_split | ---
configs:
- config_name: default
data_files:
- split: prompt0
path: data/prompt0-*
- split: prompt1
path: data/prompt1-*
- split: prompt2
path: data/prompt2-*
- split: prompt3
path: data/prompt3-*
dataset_info:
features:
- name: prompt
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: prompt0
num_bytes: 10692848
num_examples: 14999
- name: prompt1
num_bytes: 10583178
num_examples: 14999
- name: prompt2
num_bytes: 10364121
num_examples: 14999
- name: prompt3
num_bytes: 10362331
num_examples: 14999
download_size: 25308535
dataset_size: 42002478
---
# Dataset Card for "ultra_prompt_split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
edwardgiamphy/random_deletion_noise_eval_dataset.csv | ---
license: apache-2.0
---
Description: MS MARCO Passage ranking test dataset with a 3% Character Error Rate (CER) added via character-level deletion noise in the text.
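The exact noising script is not published with this card, but character-level deletion noise at a target CER can be sketched as follows (a rough illustration under the assumption that each character is dropped independently with probability equal to the CER):

```python
import random

def add_deletion_noise(text: str, cer: float = 0.03, seed: int = 42) -> str:
    """Delete each character independently with probability `cer`,
    yielding an expected character error rate of roughly `cer`."""
    rng = random.Random(seed)
    return "".join(ch for ch in text if rng.random() >= cer)

# Hypothetical passage, not taken from MS MARCO itself.
clean = "The quick brown fox jumps over the lazy dog."
noisy = add_deletion_noise(clean)
```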
|
Thaweewat/glue-mrpc-th | ---
license: cc-by-4.0
---
|
philschmid/chip2_en_code | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 1750565
num_examples: 3300
download_size: 517363
dataset_size: 1750565
---
# Dataset Card for "chip2_en_code"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Symfomany/datasllm2 | ---
license: apache-2.0
---
|
yut304/test_dataset | ---
license: mit
---
|
heegyu/OpenOrca-gugugo-ko-len100 | ---
dataset_info:
features:
- name: system_prompt
dtype: string
- name: response
dtype: string
- name: question
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 1557805553.6333735
num_examples: 1369825
download_size: 861133834
dataset_size: 1557805553.6333735
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
- Original dataset: [squarelike/OpenOrca-gugugo-ko](https://huggingface.co/datasets/squarelike/OpenOrca-gugugo-ko)
- Filtered to keep only responses with a length of 100 characters or more
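A length filter like the one described can be reproduced in plain Python; a minimal sketch, assuming the answer text lives in the `response` column:

```python
def keep_long_responses(rows, min_len=100):
    """Keep only rows whose 'response' field is at least `min_len` characters."""
    return [r for r in rows if len(r.get("response", "")) >= min_len]

# Toy rows; in the real dataset the 'response' column holds the answers.
rows = [
    {"response": "짧은 답변입니다."},            # shorter than 100 chars -> dropped
    {"response": "충분히 긴 답변입니다. " * 10},  # 100+ chars -> kept
]
print(len(keep_long_responses(rows)))  # 1
```

With the 🤗 `datasets` library, the equivalent would be `ds.filter(lambda ex: len(ex["response"]) >= 100)`.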
|
akianu/test1 | ---
license: other
license_name: test1
license_link: LICENSE
---
|
creative-graphic-design/Magazine | ---
annotations_creators:
- machine-generated
language:
- en
language_creators:
- found
license:
- unknown
multilinguality:
- monolingual
pretty_name: Magazine
size_categories: []
source_datasets:
- original
tags:
- graphic design
- layout
- content-aware
task_categories:
- image-to-image
- text-to-image
- unconditional-image-generation
task_ids: []
---
# Dataset Card for Magazine dataset
[](https://github.com/shunk031/huggingface-datasets_Magazine/actions/workflows/ci.yaml)
## Table of Contents
- [Dataset Card Creation Guide](#dataset-card-creation-guide)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://xtqiao.com/projects/content_aware_layout/
- **Repository:** https://github.com/shunk031/huggingface-datasets_Magazine
- **Paper (SIGGRAPH2019):** https://dl.acm.org/doi/10.1145/3306346.3322971
### Dataset Summary
A large-scale magazine layout dataset with fine-grained layout annotations and keyword labeling.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
To use the Magazine dataset, first download the image and layout annotations from the [OneDrive](https://portland-my.sharepoint.com/:f:/g/personal/xqiao6-c_my_cityu_edu_hk/EhmRh5SFoQ9Hjl_aRjCOltkBKFYefiSagR6QLJ7pWvs3Ww?e=y8HO5Q) linked on the [official page](https://xtqiao.com/projects/content_aware_layout/).
Then place the downloaded files in the following structure and pass the directory path to `load_dataset`.
```shell
/path/to/datasets
├── MagImage.zip
└── MagLayout.zip
```
```python
import datasets as ds
dataset = ds.load_dataset(
path="shunk031/Magazine",
data_dir="/path/to/datasets/", # Specify the path of the downloaded directory.
)
```
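Before calling `load_dataset`, it can help to check that the directory actually contains the two expected archives. A minimal sketch (the helper name `has_magazine_archives` is ours, not part of the loader):

```python
from pathlib import Path
import tempfile

# The two archives the loading script expects, per the layout shown above.
REQUIRED_FILES = ("MagImage.zip", "MagLayout.zip")

def has_magazine_archives(data_dir: str) -> bool:
    """Return True if both required Magazine archives are present in data_dir."""
    root = Path(data_dir)
    return all((root / name).is_file() for name in REQUIRED_FILES)

# Demonstrate with a temporary directory standing in for /path/to/datasets.
with tempfile.TemporaryDirectory() as tmp:
    print(has_magazine_archives(tmp))  # archives not yet placed
    for name in REQUIRED_FILES:
        (Path(tmp) / name).touch()     # stand-ins for the real downloads
    print(has_magazine_archives(tmp))  # both archives present
```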
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```bibtex
@article{zheng2019content,
title={Content-aware generative modeling of graphic design layouts},
author={Zheng, Xinru and Qiao, Xiaotian and Cao, Ying and Lau, Rynson WH},
journal={ACM Transactions on Graphics (TOG)},
volume={38},
number={4},
pages={1--15},
year={2019},
publisher={ACM New York, NY, USA}
}
```
### Contributions
Thanks to [Xinru Zheng and Xiaotian Qiao](https://xtqiao.com/projects/content_aware_layout/) for creating this dataset.
|
Sliden/mofu | ---
license: mit
---
|
autoevaluate/autoeval-eval-wmt19-de-en-04c9e1-2082967144 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- wmt19
eval_info:
task: translation
model: facebook/wmt19-en-de
metrics: []
dataset_name: wmt19
dataset_config: de-en
dataset_split: validation
col_mapping:
source: translation.en
target: translation.de
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Translation
* Model: facebook/wmt19-en-de
* Dataset: wmt19
* Config: de-en
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@WillHeld](https://huggingface.co/WillHeld) for evaluating this model. |
Diego1234/celeba | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': female
'1': male
splits:
- name: train
num_bytes: 2768237832.0
num_examples: 28000
- name: validation
num_bytes: 194932418.0
num_examples: 2000
download_size: 2963322017
dataset_size: 2963170250.0
---
# Dataset Card for "celeba"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-36bd0b51-8375121 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- scientific_papers
eval_info:
task: summarization
model: google/bigbird-pegasus-large-arxiv
metrics: ['bertscore', 'meteor']
dataset_name: scientific_papers
dataset_config: pubmed
dataset_split: test
col_mapping:
text: article
target: abstract
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/bigbird-pegasus-large-arxiv
* Dataset: scientific_papers
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Blaise_g](https://huggingface.co/Blaise_g) for evaluating this model. |
922-CA/lne2_09122023_test1 | ---
license: openrail
---
# Lora Negev (LLaMA2) 09122023 test 1
* Dataset of Negev dialogue from Girls' Frontline
* Manually edited to turn into multi-turn dialogue |
LCA-PORVID/frmt | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 327396
num_examples: 1970
download_size: 205774
dataset_size: 327396
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
minhalvp/islamqa | ---
license: apache-2.0
size_categories:
- 1K<n<10K
task_categories:
- text-generation
- conversational
dataset_info:
features:
- name: Question
dtype: string
- name: Full Answer
dtype: string
splits:
- name: train
num_bytes: 10259356
num_examples: 3355
download_size: 5425524
dataset_size: 10259356
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
The data is scraped from https://islamqa.info/ and contains Islamic questions with their corresponding answers from scholars.
Example:
| Question | Full Answer |
| -------- | ----------- |
| What should a person do if one of the areas normally washed during wudu is wounded? | Praise be to Allah. If a person is wounded in one of the areas of wudu and is not able to cover the wound with a bandage, he should perform wudu except for the wounded body part, for which he should then perform ablution with sand (tayammum). 1 (Note that this means one performs normal tayammum, not actually placing the sand on the wound itself.) 1 Al-Mughni ma'a al-Sharh il-Kabeer 1/282. | |
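Since the card lists `text-generation` and `conversational` as task categories, rows like the one above can be flattened into training strings. A minimal sketch (the `format_example` helper and prompt template are our own illustration, not part of the dataset):

```python
# Hypothetical formatting step: turn one row of the dataset
# (fields "Question" and "Full Answer") into a single training string.
def format_example(row: dict) -> str:
    return f"### Question:\n{row['Question']}\n\n### Answer:\n{row['Full Answer']}"

# A row shaped like the example in the table above (answer truncated here).
row = {
    "Question": "What should a person do if one of the areas normally washed during wudu is wounded?",
    "Full Answer": "Praise be to Allah. If a person is wounded ...",
}
print(format_example(row))
```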
tempertrash/old_controlnet_dataset_1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: QR
dtype: image
splits:
- name: train
num_bytes: 621266164.648
num_examples: 6552
download_size: 621120599
dataset_size: 621266164.648
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
KameronB/SITCC-dataset | ---
license: mit
---
# Synthetic IT Call Center Dataset
## Description
This dataset comprises 9081 items, each an example of a potential sentence generated by an IT call center agent during the course of processing tickets. The responses are designed to span a broad spectrum of quality, from highly effective to poor, simulating real-world variations in agent performance. Each response has been evaluated and scored on a scale from 0 to 10 by a simulated call center manager, with the scores reflecting the perceived effectiveness of the response in addressing customer concerns. Additionally, each item includes a rationale provided by the simulated manager, offering insights into the scoring decision.
All items are synthetically generated by gpt-4. No actual call center data is used in this dataset.
## Dataset Structure
The dataset is structured as follows:
- **sentence**: The text of the sentence generated to simulate an IT call center agent's journaling during the course of processing a ticket.
- **ranking**: A numerical value ranging from 0 to 10, assigned by the simulated manager, indicating the quality of the response. A score of 0 indicates a poor sentence, while a score of 10 represents a high-quality sentence.
- **is_grammatically_correct**: A boolean value that is `false` if grammar errors were detected and `true` otherwise.
- **reasoning**: A textual explanation provided by the simulated manager, detailing the reasons behind the assigned score.
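Given these fields, a common preprocessing step for the quality-control use case is filtering to high-scoring, grammatical examples. A minimal sketch over hypothetical rows (the real dataset has 9081 items; the threshold of 7 is our own choice):

```python
# Hypothetical rows mirroring the dataset's fields.
rows = [
    {"sentence": "Reset the user's password and verified login.",
     "ranking": 9, "is_grammatically_correct": True},
    {"sentence": "fixed thing idk",
     "ranking": 2, "is_grammatically_correct": False},
    {"sentence": "Escalated ticket to network team with full logs.",
     "ranking": 8, "is_grammatically_correct": True},
]

def high_quality(rows, min_ranking=7):
    """Keep only grammatically correct entries at or above min_ranking."""
    return [r for r in rows if r["is_grammatically_correct"] and r["ranking"] >= min_ranking]

print(len(high_quality(rows)))  # keeps the two well-written, high-scoring rows
```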
## Use Cases
This dataset is intended for use in training and evaluating machine learning models designed to assess the quality of customer service responses. Potential applications include:
- Automated quality control systems for call centers, enabling real-time feedback to agents.
- Training tools for customer service representatives, offering examples of effective communication.
- Research in natural language processing, particularly in understanding nuances in customer service interactions.
## Future
I intend to increase the size of the dataset in the future. |
CyberHarem/furisode_girl_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of furisode_girl (Pokémon)
This is the dataset of furisode_girl (Pokémon), containing 29 images and their tags.
The core tags of this character are `long_hair, brown_hair, breasts, blue_eyes, earrings, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 29 | 21.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/furisode_girl_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 29 | 17.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/furisode_girl_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 55 | 28.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/furisode_girl_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 29 | 20.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/furisode_girl_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 55 | 33.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/furisode_girl_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). To use it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/furisode_girl_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, smile, collarbone, eyelashes, jewelry, kimono, nail_polish, poke_ball_(basic), wide_sleeves, holding_poke_ball, multiple_girls, orange_hair, sash, socks, blue_nails, closed_mouth, green_eyes, standing |
| 1 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, smile, brown_eyes, dark-skinned_female, hair_flower, nipples, barefoot, medium_breasts, nude, bare_shoulders, blush, navel, sitting, solo_focus, toes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | looking_at_viewer | smile | collarbone | eyelashes | jewelry | kimono | nail_polish | poke_ball_(basic) | wide_sleeves | holding_poke_ball | multiple_girls | orange_hair | sash | socks | blue_nails | closed_mouth | green_eyes | standing | brown_eyes | dark-skinned_female | hair_flower | nipples | barefoot | medium_breasts | nude | bare_shoulders | blush | navel | sitting | solo_focus | toes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------------|:--------|:-------------|:------------|:----------|:---------|:--------------|:--------------------|:---------------|:--------------------|:-----------------|:--------------|:-------|:--------|:-------------|:---------------|:-------------|:-----------|:-------------|:----------------------|:--------------|:----------|:-----------|:-----------------|:-------|:-----------------|:--------|:--------|:----------|:-------------|:-------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | X | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Anusha64/new11 | ---
license: mit
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-67000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1015219
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_stsb_dont | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1568
num_examples: 7
- name: test
num_bytes: 361
num_examples: 3
- name: train
num_bytes: 2257
num_examples: 16
download_size: 11264
dataset_size: 4186
---
# Dataset Card for "MULTI_VALUE_stsb_dont"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-inverse-scaling__NeQA-inverse-scaling__NeQA-1e740e-1694759584 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- inverse-scaling/NeQA
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-1.3b_eval
metrics: []
dataset_name: inverse-scaling/NeQA
dataset_config: inverse-scaling--NeQA
dataset_split: train
col_mapping:
text: prompt
classes: classes
target: answer_index
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-1.3b_eval
* Dataset: inverse-scaling/NeQA
* Config: inverse-scaling--NeQA
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@MicPie](https://huggingface.co/MicPie) for evaluating this model. |
CyberHarem/kuki_shinobu_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kuki_shinobu/久岐忍/久岐忍 (Genshin Impact)
This is the dataset of kuki_shinobu/久岐忍/久岐忍 (Genshin Impact), containing 500 images and their tags.
The core tags of this character are `green_hair, purple_eyes, ponytail, breasts, hair_between_eyes, hair_ornament, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1009.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuki_shinobu_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 840.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuki_shinobu_genshin/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1311 | 1.68 GiB | [Download](https://huggingface.co/datasets/CyberHarem/kuki_shinobu_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). To use it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kuki_shinobu_genshin',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, black_shorts, crop_top, long_sleeves, looking_at_viewer, midriff, navel, ninja_mask, purple_jacket, shimenawa, short_shorts, solo, black_gloves, black_shirt, cropped_jacket, holding_dagger, stomach, thighhighs, partially_fingerless_gloves, sidelocks, thighs, cowboy_shot, fishnet_top, nail_polish, open_jacket, shoulder_armor |
| 1 | 6 |  |  |  |  |  | 1girl, black_gloves, black_shirt, black_shorts, crop_top, looking_at_viewer, midriff, navel, partially_fingerless_gloves, purple_jacket, rope, short_shorts, simple_background, solo, standing, stomach, white_background, cowboy_shot, cropped_jacket, hand_on_own_hip, long_sleeves, open_jacket, thighhighs, thighs, nail_polish, shoulder_armor, hand_up, ninja_mask, sidelocks |
| 2 | 20 |  |  |  |  |  | 1girl, purple_jacket, upper_body, ninja_mask, solo, looking_at_viewer, simple_background, white_background, hood, black_mask, black_shirt, open_clothes |
| 3 | 7 |  |  |  |  |  | 1girl, blush, hetero, nipples, penis, sex, solo_focus, spread_legs, 1boy, navel, open_mouth, purple_jacket, rope, vaginal, cropped_jacket, cum_in_pussy, female_pubic_hair, mosaic_censoring, open_jacket, pov, black_shirt, black_thighhighs, girl_on_top, large_breasts, looking_at_viewer, straddling, sweat |
| 4 | 19 |  |  |  |  |  | 1girl, solo, thighs, navel, outdoors, stomach, blush, looking_at_viewer, black_bikini, bare_shoulders, alternate_breast_size, blue_sky, cleavage, collarbone, huge_breasts, water, wet, day, rope, choker, beach, cloud, sidelocks, smile, ocean, short_ponytail |
| 5 | 6 |  |  |  |  |  | 1boy, 1girl, anus, ass_grab, black_mask, blush, hetero, penis, solo_focus, uncensored, vaginal, from_behind, indoors, ninja_mask, pussy, sweat, ass_focus, bottomless, looking_back, monitor, purple_jacket, closed_eyes, clothed_sex, cowgirl_position, girl_on_top, looking_at_viewer, nude, pubic_hair, tassel, window |
| 6 | 16 |  |  |  |  |  | looking_at_viewer, miko, 1girl, sidelocks, alternate_costume, long_sleeves, ribbon-trimmed_sleeves, wide_sleeves, solo, white_kimono, long_hair, red_hakama, hakama_skirt, closed_mouth, alternate_hairstyle, cherry_blossoms, holding, upper_body, hair_tubes, simple_background, white_background, blush |
| 7 | 5 |  |  |  |  |  | 1girl, hetero, saliva, 1boy, male_pubic_hair, mosaic_censoring, mouth_mask, solo_focus, black_mask, black_sailor_collar, half-closed_eyes, handjob, looking_at_viewer, pov, serafuku, sweat, barefoot, blush, clothed_female_nude_male, completely_nude, earrings, feet, from_side, licking_penis, open_mouth, standing, stray_pubic_hair, toes, tongue_out, uncensored |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_shorts | crop_top | long_sleeves | looking_at_viewer | midriff | navel | ninja_mask | purple_jacket | shimenawa | short_shorts | solo | black_gloves | black_shirt | cropped_jacket | holding_dagger | stomach | thighhighs | partially_fingerless_gloves | sidelocks | thighs | cowboy_shot | fishnet_top | nail_polish | open_jacket | shoulder_armor | rope | simple_background | standing | white_background | hand_on_own_hip | hand_up | upper_body | hood | black_mask | open_clothes | blush | hetero | nipples | penis | sex | solo_focus | spread_legs | 1boy | open_mouth | vaginal | cum_in_pussy | female_pubic_hair | mosaic_censoring | pov | black_thighhighs | girl_on_top | large_breasts | straddling | sweat | outdoors | black_bikini | bare_shoulders | alternate_breast_size | blue_sky | cleavage | collarbone | huge_breasts | water | wet | day | choker | beach | cloud | smile | ocean | short_ponytail | anus | ass_grab | uncensored | from_behind | indoors | pussy | ass_focus | bottomless | looking_back | monitor | closed_eyes | clothed_sex | cowgirl_position | nude | pubic_hair | tassel | window | miko | alternate_costume | ribbon-trimmed_sleeves | wide_sleeves | white_kimono | long_hair | red_hakama | hakama_skirt | closed_mouth | alternate_hairstyle | cherry_blossoms | holding | hair_tubes | saliva | male_pubic_hair | mouth_mask | black_sailor_collar | half-closed_eyes | handjob | serafuku | barefoot | clothed_female_nude_male | completely_nude | earrings | feet | from_side | licking_penis | stray_pubic_hair | toes | tongue_out |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-----------|:---------------|:--------------------|:----------|:--------|:-------------|:----------------|:------------|:---------------|:-------|:---------------|:--------------|:-----------------|:-----------------|:----------|:-------------|:------------------------------|:------------|:---------|:--------------|:--------------|:--------------|:--------------|:-----------------|:-------|:--------------------|:-----------|:-------------------|:------------------|:----------|:-------------|:-------|:-------------|:---------------|:--------|:---------|:----------|:--------|:------|:-------------|:--------------|:-------|:-------------|:----------|:---------------|:--------------------|:-------------------|:------|:-------------------|:--------------|:----------------|:-------------|:--------|:-----------|:---------------|:-----------------|:------------------------|:-----------|:-----------|:-------------|:---------------|:--------|:------|:------|:---------|:--------|:--------|:--------|:--------|:-----------------|:-------|:-----------|:-------------|:--------------|:----------|:--------|:------------|:-------------|:---------------|:----------|:--------------|:--------------|:-------------------|:-------|:-------------|:---------|:---------|:-------|:--------------------|:-------------------------|:---------------|:---------------|:------------|:-------------|:---------------|:---------------|:----------------------|:------------------|:----------|:-------------|:---------|:------------------|:-------------|:----------------------|:-------------------|:----------|:-----------|:-----------|:---------------------------|:------------------|:-----------|:-------|:------------|:----------------|:-------------------|:-------|:-------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 20 |  |  |  |  |  | X | | | | X | | | X | X | | | X | | X | | | | | | | | | | | | | | X | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | | | X | | X | | X | | | | | X | X | | | | | | | | | | X | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 19 |  |  |  |  |  | X | | | | X | | X | | | | | X | | | | | X | | | X | X | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | | X | | X | | X | | X | | | | | | X | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 16 |  |  |  |  |  | X | | | X | X | | | | | | | X | | | | | | | | X | | | | | | | | X | | X | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | X | X | | | | X | | X | X | | | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
sil-ai/bloom-lm | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- afr
- af
- aaa
- abc
- ada
- adq
- aeu
- agq
- ags
- ahk
- aia
- ajz
- aka
- ak
- ame
- amh
- am
- amp
- amu
- ann
- aph
- awa
- awb
- azn
- azo
- bag
- bam
- bm
- baw
- bax
- bbk
- bcc
- bce
- bec
- bef
- ben
- bn
- bfd
- bfm
- bfn
- bgf
- bho
- bhs
- bis
- bi
- bjn
- bjr
- bkc
- bkh
- bkm
- bkx
- bob
- bod
- bo
- boz
- bqm
- bra
- brb
- bri
- brv
- bss
- bud
- buo
- bwt
- bwx
- bxa
- bya
- bze
- bzi
- cak
- cbr
- ceb
- cgc
- chd
- chp
- cim
- clo
- cmn
- zh
- cmo
- csw
- cuh
- cuv
- dag
- ddg
- ded
- deu
- de
- dig
- dje
- dmg
- dnw
- dtp
- dtr
- dty
- dug
- eee
- ekm
- enb
- enc
- eng
- en
- ewo
- fas
- fa
- fil
- fli
- fon
- fra
- fr
- fub
- fuh
- gal
- gbj
- gou
- gsw
- guc
- guj
- gu
- guz
- gwc
- hao
- hat
- ht
- hau
- ha
- hbb
- hig
- hil
- hin
- hi
- hla
- hna
- hre
- hro
- idt
- ilo
- ind
- id
- ino
- isu
- ita
- it
- jgo
- jmx
- jpn
- ja
- jra
- kak
- kam
- kan
- kn
- kau
- kr
- kbq
- kbx
- kby
- kek
- ken
- khb
- khm
- km
- kik
- ki
- kin
- rw
- kir
- ky
- kjb
- kmg
- kmr
- ku
- kms
- kmu
- kor
- ko
- kqr
- krr
- ksw
- kur
- ku
- kvt
- kwd
- kwu
- kwx
- kxp
- kyq
- laj
- lan
- lao
- lo
- lbr
- lfa
- lgg
- lgr
- lhm
- lhu
- lkb
- llg
- lmp
- lns
- loh
- lsi
- lts
- lug
- lg
- luy
- lwl
- mai
- mal
- ml
- mam
- mar
- mr
- mdr
- mfh
- mfj
- mgg
- mgm
- mgo
- mgq
- mhx
- miy
- mkz
- mle
- mlk
- mlw
- mmu
- mne
- mnf
- mnw
- mot
- mqj
- mrn
- mry
- msb
- muv
- mve
- mxu
- mya
- my
- myk
- myx
- mzm
- nas
- nco
- nep
- ne
- new
- nge
- ngn
- nhx
- njy
- nla
- nld
- nl
- nlv
- nod
- nsk
- nsn
- nso
- nst
- nuj
- nwe
- nwi
- nxa
- nxl
- nya
- ny
- nyo
- nyu
- nza
- odk
- oji
- oj
- oki
- omw
- ori
- or
- ozm
- pae
- pag
- pan
- pa
- pbt
- pce
- pcg
- pdu
- pea
- pex
- pis
- pkb
- pmf
- pnz
- por
- pt
- psp
- pwg
- qaa
- qub
- quc
- quf
- quz
- qve
- qvh
- qvm
- qvo
- qxh
- rel
- rnl
- ron
- ro
- roo
- rue
- rug
- rus
- ru
- san
- sa
- saq
- sat
- sdk
- sea
- sgd
- shn
- sml
- snk
- snl
- som
- so
- sot
- st
- sox
- spa
- es
- sps
- ssn
- stk
- swa
- sw
- swh
- sxb
- syw
- taj
- tam
- ta
- tbj
- tdb
- tdg
- tdt
- teo
- tet
- tgk
- tg
- tha
- th
- the
- thk
- thl
- thy
- tio
- tkd
- tnl
- tnn
- tnp
- tnt
- tod
- tom
- tpi
- tpl
- tpu
- tsb
- tsn
- tn
- tso
- ts
- tuv
- tuz
- tvs
- udg
- unr
- urd
- ur
- uzb
- uz
- ven
- ve
- vie
- vi
- vif
- war
- wbm
- wbr
- wms
- wni
- wnk
- wtk
- xho
- xh
- xkg
- xmd
- xmg
- xmm
- xog
- xty
- yas
- yav
- ybb
- ybh
- ybi
- ydd
- yea
- yet
- yid
- yi
- yin
- ymp
- zaw
- zho
- zh
- zlm
- zuh
- zul
- zu
license:
- cc-by-4.0
- cc-by-nc-4.0
- cc-by-nd-4.0
- cc-by-sa-4.0
- cc-by-nc-nd-4.0
- cc-by-nc-sa-4.0
multilinguality:
- multilingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_ids:
- language-modeling
paperswithcode_id: null
pretty_name: BloomLM
extra_gated_prompt: |-
One more step before getting this dataset. This dataset is open access and available only for non-commercial use (except for portions of the dataset labeled with a `cc-by-sa` license). A "license" field paired with each of the dataset entries/samples specifies the Creative Commons license for that entry/sample.
These [Creative Commons licenses](https://creativecommons.org/about/cclicenses/) specify that:
  1. You cannot use the dataset for or directed toward commercial advantage or monetary compensation (except for those portions of the dataset labeled specifically with a `cc-by-sa` license). If you would like to ask about commercial uses of this dataset, please [email us](mailto:sj@derivation.co).
2. Any public, non-commercial use of the data must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
3. For those portions of the dataset marked with an ND license, you cannot remix, transform, or build upon the material, and you may not distribute modified material.
In addition to the above implied by Creative Commons and when clicking "Access Repository" below, you agree:
1. Not to use the dataset for any use intended to or which has the effect of harming or enabling discrimination against individuals or groups based on legally protected characteristics or categories, including but not limited to discrimination against Indigenous People as outlined in Articles 2; 13-16; and 31 of the United Nations Declaration on the Rights of Indigenous People, 13 September 2007 and as subsequently amended and revised.
2. That your *contact information* (email address and username) can be shared with the model authors as well.
extra_gated_fields:
I have read the License and agree with its terms: checkbox
---
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
<!-- - [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions) -->
## Dataset Description
- **Homepage:** [SIL AI](https://ai.sil.org/)
- **Point of Contact:** [SIL AI email](mailto:idx_aqua@sil.org)
- **Source Data:** [Bloom Library](https://bloomlibrary.org/)
 
## Dataset Summary
**Bloom** is free, open-source software and an associated website [Bloom Library](https://bloomlibrary.org/), app, and services developed by [SIL International](https://www.sil.org/). Bloom’s primary goal is to equip non-dominant language communities and their members to create the literature they want for their community and children. Bloom also serves organizations that help such communities develop literature and education or other aspects of community development.
This version of the Bloom Library data is developed specifically for the language modeling task. It includes data from 364 languages across 31 language families. There is a mean of 32 stories and median of 2 stories per language.
**Note**: If you speak one of these languages and can help provide feedback or corrections, please let us know!
**Note**: Although this data was used in the training of the [BLOOM model](https://huggingface.co/bigscience/bloom), this dataset only represents a small portion of the data used to train that model. Data from "Bloom Library" was combined with a large number of other datasets to train that model. "Bloom Library" is a project that existed prior to the BLOOM model, and is something separate. All that to say... We were using the "Bloom" name before it was cool. 😉
## Languages
Of the 500+ languages listed at BloomLibrary.org, there are 363 languages available in this dataset. Here are the corresponding ISO 639-3 codes:
aaa, abc, ada, adq, aeu, afr, agq, ags, ahk, aia, ajz, aka, ame, amh, amp, amu, ann, aph, awa, awb, azn, azo, bag, bam, baw, bax, bbk, bcc, bce, bec, bef, ben, bfd, bfm, bfn, bgf, bho, bhs, bis, bjn, bjr, bkc, bkh, bkm, bkx, bob, bod, boz, bqm, bra, brb, bri, brv, bss, bud, buo, bwt, bwx, bxa, bya, bze, bzi, cak, cbr, ceb, cgc, chd, chp, cim, clo, cmn, cmo, csw, cuh, cuv, dag, ddg, ded, deu, dig, dje, dmg, dnw, dtp, dtr, dty, dug, eee, ekm, enb, enc, eng, ewo, fas, fil, fli, fon, fra, fub, fuh, gal, gbj, gou, gsw, guc, guj, guz, gwc, hao, hat, hau, hbb, hig, hil, hin, hla, hna, hre, hro, idt, ilo, ind, ino, isu, ita, jgo, jmx, jpn, jra, kak, kam, kan, kau, kbq, kbx, kby, kek, ken, khb, khm, kik, kin, kir, kjb, kmg, kmr, kms, kmu, kor, kqr, krr, ksw, kur, kvt, kwd, kwu, kwx, kxp, kyq, laj, lan, lao, lbr, lfa, lgg, lgr, lhm, lhu, lkb, llg, lmp, lns, loh, lsi, lts, lug, luy, lwl, mai, mal, mam, mar, mdr, mfh, mfj, mgg, mgm, mgo, mgq, mhx, miy, mkz, mle, mlk, mlw, mmu, mne, mnf, mnw, mot, mqj, mrn, mry, msb, muv, mve, mxu, mya, myk, myx, mzm, nas, nco, nep, new, nge, ngn, nhx, njy, nla, nld, nlv, nod, nsk, nsn, nso, nst, nuj, nwe, nwi, nxa, nxl, nya, nyo, nyu, nza, odk, oji, oki, omw, ori, ozm, pae, pag, pan, pbt, pce, pcg, pdu, pea, pex, pis, pkb, pmf, pnz, por, psp, pwg, qub, quc, quf, quz, qve, qvh, qvm, qvo, qxh, rel, rnl, ron, roo, rue, rug, rus, san, saq, sat, sdk, sea, sgd, shn, sml, snk, snl, som, sot, sox, spa, sps, ssn, stk, swa, swh, sxb, syw, taj, tam, tbj, tdb, tdg, tdt, teo, tet, tgk, tha, the, thk, thl, thy, tio, tkd, tnl, tnn, tnp, tnt, tod, tom, tpi, tpl, tpu, tsb, tsn, tso, tuv, tuz, tvs, udg, unr, urd, uzb, ven, vie, vif, war, wbm, wbr, wms, wni, wnk, wtk, xho, xkg, xmd, xmg, xmm, xog, xty, yas, yav, ybb, ybh, ybi, ydd, yea, yet, yid, yin, ymp, zaw, zho, zlm, zuh, zul
## Dataset Statistics
Some languages in the dataset include only one or a handful of stories; these are not split between training, validation, and test. For languages with a larger number of available stories, we include the following numbers of stories in each split:
| ISO 639-3 | Name | Train Stories | Validation Stories | Test Stories |
|:------------|:------------------------------|----------------:|---------------------:|---------------:|
| aeu | Akeu | 47 | 6 | 5 |
| afr | Afrikaans | 19 | 2 | 2 |
| ahk | Akha | 81 | 10 | 10 |
| aph | Athpariya | 28 | 4 | 3 |
| awa | Awadhi | 131 | 16 | 16 |
| ben | Bengali | 201 | 25 | 25 |
| bfn | Bunak | 11 | 1 | 1 |
| bho | Bhojpuri | 139 | 17 | 17 |
| bis | Bislama | 20 | 2 | 2 |
| bkm | Kom (Cameroon) | 15 | 2 | 1 |
| bkx | Baikeno | 8 | 1 | 1 |
| brb | Brao | 18 | 2 | 2 |
| bwx | Bu-Nao Bunu | 14 | 2 | 1 |
| bzi | Bisu | 53 | 7 | 6 |
| cak | Kaqchikel | 54 | 7 | 6 |
| cbr | Cashibo-Cacataibo | 11 | 1 | 1 |
| ceb | Cebuano | 335 | 42 | 41 |
| cgc | Kagayanen | 158 | 20 | 19 |
| cmo | Central Mnong | 16 | 2 | 2 |
| ddg | Fataluku | 14 | 2 | 1 |
| deu | German | 36 | 4 | 4 |
| dtp | Kadazan Dusun | 13 | 2 | 1 |
| dty | Dotyali | 138 | 17 | 17 |
| eng | English | 2107 | 263 | 263 |
| fas | Persian | 104 | 13 | 12 |
| fil | Filipino | 55 | 7 | 6 |
| fra | French | 323 | 40 | 40 |
| gal | Galolen | 11 | 1 | 1 |
| gwc | Gawri | 15 | 2 | 1 |
| hat | Haitian | 208 | 26 | 26 |
| hau | Hausa | 205 | 26 | 25 |
| hbb | Huba | 22 | 3 | 2 |
| hin | Hindi | 16 | 2 | 2 |
| idt | Idaté | 8 | 1 | 1 |
| ind | Indonesian | 208 | 26 | 25 |
| jmx | Western Juxtlahuaca Mixtec | 19 | 2 | 2 |
| jra | Jarai | 112 | 14 | 13 |
| kak | Kalanguya | 156 | 20 | 19 |
| kan | Kannada | 17 | 2 | 2 |
| kau | Kanuri | 36 | 5 | 4 |
| kek | Kekchí | 29 | 4 | 3 |
| khb | Lü | 25 | 3 | 3 |
| khm | Khmer | 28 | 4 | 3 |
| kik | Kikuyu | 8 | 1 | 1 |
| kir | Kirghiz | 306 | 38 | 38 |
| kjb | Q'anjob'al | 82 | 10 | 10 |
| kmg | Kâte | 16 | 2 | 1 |
| kor | Korean | 106 | 13 | 13 |
| krr | Krung | 24 | 3 | 3 |
| kwd | Kwaio | 19 | 2 | 2 |
| kwu | Kwakum | 16 | 2 | 2 |
| lbr | Lohorung | 8 | 1 | 1 |
| lhu | Lahu | 32 | 4 | 4 |
| lsi | Lashi | 21 | 3 | 2 |
| mai | Maithili | 144 | 18 | 18 |
| mal | Malayalam | 12 | 1 | 1 |
| mam | Mam | 108 | 13 | 13 |
| mar | Marathi | 8 | 1 | 1 |
| mgm | Mambae | 12 | 2 | 1 |
| mhx | Maru | 79 | 10 | 9 |
| mkz | Makasae | 16 | 2 | 2 |
| mya | Burmese | 31 | 4 | 3 |
| myk | Mamara Senoufo | 28 | 3 | 3 |
| nep | Nepali (macrolanguage) | 160 | 20 | 20 |
| new | Newari | 142 | 18 | 17 |
| nlv | Orizaba Nahuatl | 8 | 1 | 1 |
| nsn | Nehan | 9 | 1 | 1 |
| nwi | Southwest Tanna | 9 | 1 | 1 |
| nxa | Nauete | 12 | 1 | 1 |
| omw | South Tairora | 10 | 1 | 1 |
| pbt | Southern Pashto | 164 | 21 | 20 |
| pce | Ruching Palaung | 30 | 4 | 3 |
| pis | Pijin | 14 | 2 | 1 |
| por | Portuguese | 131 | 16 | 16 |
| quc | K'iche' | 80 | 10 | 9 |
| rus | Russian | 283 | 35 | 35 |
| sdk | Sos Kundi | 9 | 1 | 1 |
| snk | Soninke | 28 | 4 | 3 |
| spa | Spanish | 423 | 53 | 52 |
| swh | Swahili (individual language) | 58 | 7 | 7 |
| tam | Tamil | 13 | 2 | 1 |
| tdg | Western Tamang | 26 | 3 | 3 |
| tdt | Tetun Dili | 22 | 3 | 2 |
| tet | Tetum | 8 | 1 | 1 |
| tgk | Tajik | 24 | 3 | 2 |
| tha | Thai | 228 | 29 | 28 |
| the | Chitwania Tharu | 11 | 1 | 1 |
| thl | Dangaura Tharu | 148 | 19 | 18 |
| tnl | Lenakel | 10 | 1 | 1 |
| tnn | North Tanna | 9 | 1 | 1 |
| tpi | Tok Pisin | 161 | 20 | 20 |
| tpu | Tampuan | 24 | 3 | 2 |
| uzb | Uzbek | 24 | 3 | 2 |
| war | Waray (Philippines) | 16 | 2 | 2 |
| wbr | Wagdi | 10 | 1 | 1 |
| wni | Ndzwani Comorian | 12 | 2 | 1 |
| xkg | Kagoro | 16 | 2 | 1 |
| ybh | Yakha | 16 | 2 | 1 |
| zho | Chinese | 34 | 4 | 4 |
| zlm | Malay (individual language) | 8 | 1 | 1 |
| zul | Zulu | 19 | 2 | 2 |
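The split sizes in the table above work out to roughly 80% train / 10% validation / 10% test per language. As a sanity-check sketch for reading the table (our own approximation, not the dataset's actual splitting code), the per-language counts can be reproduced like this:

```python
def approx_splits(total_stories: int) -> tuple:
    """Approximate (train, validation, test) story counts for a language.

    Assumption (ours, for illustration): validation is ~10% rounded to the
    nearest integer, test is ~10% rounded down, and train is the remainder.
    """
    validation = int(total_stories * 0.1 + 0.5)  # ~10%, rounded
    test = int(total_stories * 0.1)              # ~10%, rounded down
    train = total_stories - validation - test    # remainder, ~80%
    return train, validation, test

# Spot-check against a few rows of the table:
print(approx_splits(2633))  # eng: (2107, 263, 263)
print(approx_splits(58))    # aeu: (47, 6, 5)
print(approx_splits(23))    # afr: (19, 2, 2)
```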
## Dataset Structure
### Data Instances
For example, to load and inspect a story in Hindi:
```python
from datasets import load_dataset
# Specify the language code.
dataset = load_dataset("sil-ai/bloom-lm", 'hin')
# A data point consists of stories in the specified language code.
# To see a story:
print(dataset['train']['text'][0])
```
This produces output like the following:
```
साबू ने एक कंकड़ को ठोकर मारी। कंकड़ लुढ़कता हुआ एक पेड़ के पास पहुँचा। पेड़ के तने पर मुलायम बाल थे। साबू ने छुए और ऊपर देखा, ऊपर, ऊपर और उससे भी ऊपर...दो आँखें नीचे देख रही थीं।
“हेलो, तुम कौन हो?” साबू को बड़ा अचम्भा हुआ।“हेलो, मैं जिराफ़ हूँ। मेरा नाम है जोजो। मैं तुम्हारे साथ खेल सकता हूँ। मेरी पीठ पर चढ़ जाओ, मैं तुम्हें घुमा के लाता हूँ।”
साबू जोजो की पीठ पर चढ़ गया और वे सड़क पर चल निकले। फिर पहाड़ी पर और शहर के बीचों बीच।
साबू खुशी से चिल्लाया, “जोजो दाएँ मुड़ो,
बाएँ मुड़ो और फिर दाएँ।” अब वे उसकी दोस्त मुन्नी के घर पहुँच गये।
आज मुन्नी का जन्मदिन था। साबू को जोजो पर सवारी करते देख बच्चों ने ताली बजायी।
जोजो ने गुब्बारे लटकाने में आन्टी की मदद करी क्योंकि वह इतना... लम्बा था।
कितना आसान था!
जोजो ने सब बच्चों को सवारी कराई।
उनके साथ बॉल भी खेली। बड़े मज़े की पार्टी थी।सब ने गाया, “हैप्पी बर्थ डे टु यू ।”
आन्टी ने मेज़ पर समोसे, गुलाब जामुन और आइसक्रीम सजाई।
जोजो को आइसक्रीम बहुत पसन्द आई। अंकल उसके लिये एक बाल्टी भर के आइसक्रीम लाये। जोजो ने पूरी बाल्टी ख़त्म कर दी। अब घर जाने का समय हो गया।
सब ने कहा, “बाय बाय जोजो, बाय बाय साबू।” साबू और जोजो घर लौटे।
```
To gather all of the text for a given language, you can use:
```python
dataset['train']['text']
```
### Data Fields
The metadata fields below are available. The full dataset will be updated with per-story metadata soon (in August 2022). As of now, a majority of stories have metadata, but some are missing certain fields. In terms of licenses, all stories included in the current release are released under a Creative Commons license (even if the individual story metadata fields are missing).
- **text**: the text of the story/book, concatenated together from the different pages.
- **id**: id of the sample
- **title**: title of the book, e.g. "Going to Buy a Book".
- **license**: specific license used, e.g. "cc-by-sa" for "Creative Commons, by attribution, share-alike".
- **copyright**: copyright notice from the original book on bloomlibrary.org
- **pageCount**: page count from the metadata on the original book on bloomlibrary.org.
- **bookInstanceId**: unique ID for each book/translation assigned by Bloom. For example the Hindi version of 'Going to Buy a Book' is 'af86eefd-f69c-4e06-b8eb-e0451853aab9'.
- **bookLineage**: Unique bookInstanceIDs of _other_ Bloom books that this book is in some way based on. For example, the Hindi version in the example above is based on '056B6F11-4A6C-4942-B2BC-8861E62B03B3'. It's quite possible for this to be either empty, or have multiple entries. For example, the book 'Saboo y Jojo' with ID '5b232a5f-561d-4514-afe7-d6ed2f6a940f' is based on two others, ['056B6F11-4A6C-4942-B2BC-8861E62B03B3', '10a6075b-3c4f-40e4-94f3-593497f2793a']
- (coming soon) **contentLanguages**: Other languages this book may be available in. "Going to Buy a Book" is available in ['eng', 'kan', 'mar', 'pan', 'ben', 'guj', 'hin'] for example.
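As an illustration of the fields above, here is a sketch of what a single record might look like, together with a small helper based on the gating terms at the top of this card. Only the field names come from the list above; the values and the `commercially_usable` helper are invented for illustration:

```python
# Hypothetical record: field names follow the card's field list,
# but every value here is made up for illustration.
sample = {
    "text": "साबू ने एक कंकड़ को ठोकर मारी। ...",
    "id": "example-id",
    "title": "Going to Buy a Book",
    "license": "cc-by-sa",
    "copyright": "(example copyright notice)",
    "pageCount": 12,
    "bookInstanceId": "af86eefd-f69c-4e06-b8eb-e0451853aab9",
    "bookLineage": ["056B6F11-4A6C-4942-B2BC-8861E62B03B3"],
}

def commercially_usable(record: dict) -> bool:
    """Per the gating terms above, only `cc-by-sa` entries allow commercial use."""
    return record.get("license") == "cc-by-sa"

print(commercially_usable(sample))  # True
```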
### Data Splits
All languages include a train, validation, and test split. However, for languages with a small number of stories, some of these splits may be empty. In such cases, we recommend using the available data for testing only, or for zero-shot experiments.
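A minimal sketch of guarding against empty splits before using them (a plain dict of lists stands in for the loaded `DatasetDict` here so the example runs offline; with the real dataset you would check `len(dataset[split])` in the same way):

```python
def usable_splits(dataset: dict) -> dict:
    """Return only the splits that actually contain stories."""
    return {name: rows for name, rows in dataset.items() if len(rows) > 0}

# Hypothetical low-resource language: all stories landed in "train".
low_resource = {"train": ["story 1", "story 2"], "validation": [], "test": []}
print(sorted(usable_splits(low_resource)))  # ['train']
```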
## Changelog
- **25 August 2022** - add the remaining metadata, change data type of `pageCount` to int32
- **24 August 2022** - majority of metadata added back in to the filtered/clean data
- **23 August 2022** - metadata temporarily removed to update to cleaner dataset |
jacobbieker/gdas-kerchunk | ---
license: mit
---
|
open-llm-leaderboard/details_Zangs3011__codellama_7b_DolphinCoder | ---
pretty_name: Evaluation run of Zangs3011/codellama_7b_DolphinCoder
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Zangs3011/codellama_7b_DolphinCoder](https://huggingface.co/Zangs3011/codellama_7b_DolphinCoder)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Zangs3011__codellama_7b_DolphinCoder\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-23T16:19:11.298968](https://huggingface.co/datasets/open-llm-leaderboard/details_Zangs3011__codellama_7b_DolphinCoder/blob/main/results_2023-12-23T16-19-11.298968.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3827239446273333,\n\
\ \"acc_stderr\": 0.034226432114737984,\n \"acc_norm\": 0.3863708183260275,\n\
\ \"acc_norm_stderr\": 0.03501715050425477,\n \"mc1\": 0.22031823745410037,\n\
\ \"mc1_stderr\": 0.014509045171487291,\n \"mc2\": 0.35450592505891126,\n\
\ \"mc2_stderr\": 0.014292262562897113\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.39761092150170646,\n \"acc_stderr\": 0.014301752223279536,\n\
\ \"acc_norm\": 0.4197952218430034,\n \"acc_norm_stderr\": 0.014422181226303026\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.49432383987253536,\n\
\ \"acc_stderr\": 0.004989459871609184,\n \"acc_norm\": 0.6550487950607449,\n\
\ \"acc_norm_stderr\": 0.004743808792037848\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.34868421052631576,\n \"acc_stderr\": 0.038781398887976104,\n\
\ \"acc_norm\": 0.34868421052631576,\n \"acc_norm_stderr\": 0.038781398887976104\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.33962264150943394,\n \"acc_stderr\": 0.02914690474779833,\n\
\ \"acc_norm\": 0.33962264150943394,\n \"acc_norm_stderr\": 0.02914690474779833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.039420826399272135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.3236994219653179,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364397,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364397\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.029896145682095455,\n\
\ \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.029896145682095455\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643898,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643898\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.36774193548387096,\n \"acc_stderr\": 0.02743086657997347,\n \"\
acc_norm\": 0.36774193548387096,\n \"acc_norm_stderr\": 0.02743086657997347\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.24630541871921183,\n \"acc_stderr\": 0.030315099285617715,\n \"\
acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.030315099285617715\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4121212121212121,\n \"acc_stderr\": 0.03843566993588717,\n\
\ \"acc_norm\": 0.4121212121212121,\n \"acc_norm_stderr\": 0.03843566993588717\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.46464646464646464,\n \"acc_stderr\": 0.03553436368828063,\n \"\
acc_norm\": 0.46464646464646464,\n \"acc_norm_stderr\": 0.03553436368828063\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.45595854922279794,\n \"acc_stderr\": 0.035944137112724366,\n\
\ \"acc_norm\": 0.45595854922279794,\n \"acc_norm_stderr\": 0.035944137112724366\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.36153846153846153,\n \"acc_stderr\": 0.024359581465396987,\n\
\ \"acc_norm\": 0.36153846153846153,\n \"acc_norm_stderr\": 0.024359581465396987\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228402,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228402\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.3739495798319328,\n \"acc_stderr\": 0.031429466378837076,\n\
\ \"acc_norm\": 0.3739495798319328,\n \"acc_norm_stderr\": 0.031429466378837076\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073328,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073328\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.48073394495412847,\n \"acc_stderr\": 0.021421402982548878,\n \"\
acc_norm\": 0.48073394495412847,\n \"acc_norm_stderr\": 0.021421402982548878\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4117647058823529,\n \"acc_stderr\": 0.034542365853806094,\n \"\
acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.034542365853806094\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4810126582278481,\n \"acc_stderr\": 0.03252375148090448,\n \
\ \"acc_norm\": 0.4810126582278481,\n \"acc_norm_stderr\": 0.03252375148090448\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4080717488789238,\n\
\ \"acc_stderr\": 0.03298574607842821,\n \"acc_norm\": 0.4080717488789238,\n\
\ \"acc_norm_stderr\": 0.03298574607842821\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.33587786259541985,\n \"acc_stderr\": 0.04142313771996665,\n\
\ \"acc_norm\": 0.33587786259541985,\n \"acc_norm_stderr\": 0.04142313771996665\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4380165289256198,\n \"acc_stderr\": 0.045291468044357915,\n \"\
acc_norm\": 0.4380165289256198,\n \"acc_norm_stderr\": 0.045291468044357915\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3987730061349693,\n \"acc_stderr\": 0.03847021420456026,\n\
\ \"acc_norm\": 0.3987730061349693,\n \"acc_norm_stderr\": 0.03847021420456026\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.47572815533980584,\n \"acc_stderr\": 0.049449010929737795,\n\
\ \"acc_norm\": 0.47572815533980584,\n \"acc_norm_stderr\": 0.049449010929737795\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5726495726495726,\n\
\ \"acc_stderr\": 0.032408473935163266,\n \"acc_norm\": 0.5726495726495726,\n\
\ \"acc_norm_stderr\": 0.032408473935163266\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4482758620689655,\n\
\ \"acc_stderr\": 0.017784034534992433,\n \"acc_norm\": 0.4482758620689655,\n\
\ \"acc_norm_stderr\": 0.017784034534992433\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3988439306358382,\n \"acc_stderr\": 0.026362437574546545,\n\
\ \"acc_norm\": 0.3988439306358382,\n \"acc_norm_stderr\": 0.026362437574546545\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331154,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331154\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.02845263998508801,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.02845263998508801\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4180064308681672,\n\
\ \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.4180064308681672,\n\
\ \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.027586006221607715,\n\
\ \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.027586006221607715\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.30141843971631205,\n \"acc_stderr\": 0.027374128882631146,\n \
\ \"acc_norm\": 0.30141843971631205,\n \"acc_norm_stderr\": 0.027374128882631146\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.303129074315515,\n\
\ \"acc_stderr\": 0.011738669951254293,\n \"acc_norm\": 0.303129074315515,\n\
\ \"acc_norm_stderr\": 0.011738669951254293\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.35661764705882354,\n \"acc_stderr\": 0.029097209568411955,\n\
\ \"acc_norm\": 0.35661764705882354,\n \"acc_norm_stderr\": 0.029097209568411955\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.35294117647058826,\n \"acc_stderr\": 0.01933314202079706,\n \
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.01933314202079706\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4727272727272727,\n\
\ \"acc_stderr\": 0.04782001791380063,\n \"acc_norm\": 0.4727272727272727,\n\
\ \"acc_norm_stderr\": 0.04782001791380063\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.40816326530612246,\n \"acc_stderr\": 0.03146465712827424,\n\
\ \"acc_norm\": 0.40816326530612246,\n \"acc_norm_stderr\": 0.03146465712827424\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5124378109452736,\n\
\ \"acc_stderr\": 0.03534439848539579,\n \"acc_norm\": 0.5124378109452736,\n\
\ \"acc_norm_stderr\": 0.03534439848539579\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.35542168674698793,\n\
\ \"acc_stderr\": 0.03726214354322415,\n \"acc_norm\": 0.35542168674698793,\n\
\ \"acc_norm_stderr\": 0.03726214354322415\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.47953216374269003,\n \"acc_stderr\": 0.038316105328219316,\n\
\ \"acc_norm\": 0.47953216374269003,\n \"acc_norm_stderr\": 0.038316105328219316\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22031823745410037,\n\
\ \"mc1_stderr\": 0.014509045171487291,\n \"mc2\": 0.35450592505891126,\n\
\ \"mc2_stderr\": 0.014292262562897113\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6361483820047356,\n \"acc_stderr\": 0.013521488896883408\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09704321455648218,\n \
\ \"acc_stderr\": 0.008153768274554735\n }\n}\n```"
repo_url: https://huggingface.co/Zangs3011/codellama_7b_DolphinCoder
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|arc:challenge|25_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|gsm8k|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hellaswag|10_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T16-19-11.298968.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T16-19-11.298968.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- '**/details_harness|winogrande|5_2023-12-23T16-19-11.298968.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-23T16-19-11.298968.parquet'
- config_name: results
data_files:
- split: 2023_12_23T16_19_11.298968
path:
- results_2023-12-23T16-19-11.298968.parquet
- split: latest
path:
- results_2023-12-23T16-19-11.298968.parquet
---
# Dataset Card for Evaluation run of Zangs3011/codellama_7b_DolphinCoder
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Zangs3011/codellama_7b_DolphinCoder](https://huggingface.co/Zangs3011/codellama_7b_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Zangs3011__codellama_7b_DolphinCoder",
"harness_winogrande_5",
split="train")
```
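The results file pointed to by the "results" configuration is a flat JSON mapping from task name to metric dict, as in the excerpt further down. As a minimal sketch (using a hand-copied fragment of those numbers rather than an actual download), per-task accuracies can be pulled out like this:

```python
# Hand-copied fragment of the results JSON, for illustration only.
results = {
    "all": {"acc": 0.3827239446273333, "acc_norm": 0.3863708183260275},
    "harness|arc:challenge|25": {"acc": 0.39761092150170646, "acc_norm": 0.4197952218430034},
    "harness|hellaswag|10": {"acc": 0.49432383987253536, "acc_norm": 0.6550487950607449},
}

# Per-task accuracy, skipping the aggregated "all" entry.
per_task_acc = {task: scores["acc"] for task, scores in results.items() if task != "all"}
print(per_task_acc)
```

The same comprehension works on the full dict loaded from the JSON file linked below.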
## Latest results
These are the [latest results from run 2023-12-23T16:19:11.298968](https://huggingface.co/datasets/open-llm-leaderboard/details_Zangs3011__codellama_7b_DolphinCoder/blob/main/results_2023-12-23T16-19-11.298968.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.3827239446273333,
"acc_stderr": 0.034226432114737984,
"acc_norm": 0.3863708183260275,
"acc_norm_stderr": 0.03501715050425477,
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487291,
"mc2": 0.35450592505891126,
"mc2_stderr": 0.014292262562897113
},
"harness|arc:challenge|25": {
"acc": 0.39761092150170646,
"acc_stderr": 0.014301752223279536,
"acc_norm": 0.4197952218430034,
"acc_norm_stderr": 0.014422181226303026
},
"harness|hellaswag|10": {
"acc": 0.49432383987253536,
"acc_stderr": 0.004989459871609184,
"acc_norm": 0.6550487950607449,
"acc_norm_stderr": 0.004743808792037848
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.34868421052631576,
"acc_stderr": 0.038781398887976104,
"acc_norm": 0.34868421052631576,
"acc_norm_stderr": 0.038781398887976104
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.33962264150943394,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.33962264150943394,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364397,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364397
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.029896145682095455,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.029896145682095455
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643898,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.36774193548387096,
"acc_stderr": 0.02743086657997347,
"acc_norm": 0.36774193548387096,
"acc_norm_stderr": 0.02743086657997347
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.24630541871921183,
"acc_stderr": 0.030315099285617715,
"acc_norm": 0.24630541871921183,
"acc_norm_stderr": 0.030315099285617715
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4121212121212121,
"acc_stderr": 0.03843566993588717,
"acc_norm": 0.4121212121212121,
"acc_norm_stderr": 0.03843566993588717
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.46464646464646464,
"acc_stderr": 0.03553436368828063,
"acc_norm": 0.46464646464646464,
"acc_norm_stderr": 0.03553436368828063
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.45595854922279794,
"acc_stderr": 0.035944137112724366,
"acc_norm": 0.45595854922279794,
"acc_norm_stderr": 0.035944137112724366
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36153846153846153,
"acc_stderr": 0.024359581465396987,
"acc_norm": 0.36153846153846153,
"acc_norm_stderr": 0.024359581465396987
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228402,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228402
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3739495798319328,
"acc_stderr": 0.031429466378837076,
"acc_norm": 0.3739495798319328,
"acc_norm_stderr": 0.031429466378837076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073328,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073328
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.48073394495412847,
"acc_stderr": 0.021421402982548878,
"acc_norm": 0.48073394495412847,
"acc_norm_stderr": 0.021421402982548878
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.034542365853806094,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.034542365853806094
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4810126582278481,
"acc_stderr": 0.03252375148090448,
"acc_norm": 0.4810126582278481,
"acc_norm_stderr": 0.03252375148090448
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4080717488789238,
"acc_stderr": 0.03298574607842821,
"acc_norm": 0.4080717488789238,
"acc_norm_stderr": 0.03298574607842821
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.33587786259541985,
"acc_stderr": 0.04142313771996665,
"acc_norm": 0.33587786259541985,
"acc_norm_stderr": 0.04142313771996665
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4380165289256198,
"acc_stderr": 0.045291468044357915,
"acc_norm": 0.4380165289256198,
"acc_norm_stderr": 0.045291468044357915
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3987730061349693,
"acc_stderr": 0.03847021420456026,
"acc_norm": 0.3987730061349693,
"acc_norm_stderr": 0.03847021420456026
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.47572815533980584,
"acc_stderr": 0.049449010929737795,
"acc_norm": 0.47572815533980584,
"acc_norm_stderr": 0.049449010929737795
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5726495726495726,
"acc_stderr": 0.032408473935163266,
"acc_norm": 0.5726495726495726,
"acc_norm_stderr": 0.032408473935163266
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.017784034534992433,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.017784034534992433
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.026362437574546545,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.026362437574546545
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331154,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331154
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.02845263998508801,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.02845263998508801
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4180064308681672,
"acc_stderr": 0.028013651891995072,
"acc_norm": 0.4180064308681672,
"acc_norm_stderr": 0.028013651891995072
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.027586006221607715,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.027586006221607715
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.30141843971631205,
"acc_stderr": 0.027374128882631146,
"acc_norm": 0.30141843971631205,
"acc_norm_stderr": 0.027374128882631146
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.303129074315515,
"acc_stderr": 0.011738669951254293,
"acc_norm": 0.303129074315515,
"acc_norm_stderr": 0.011738669951254293
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.35661764705882354,
"acc_stderr": 0.029097209568411955,
"acc_norm": 0.35661764705882354,
"acc_norm_stderr": 0.029097209568411955
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.01933314202079706,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.01933314202079706
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.04782001791380063,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.04782001791380063
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40816326530612246,
"acc_stderr": 0.03146465712827424,
"acc_norm": 0.40816326530612246,
"acc_norm_stderr": 0.03146465712827424
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5124378109452736,
"acc_stderr": 0.03534439848539579,
"acc_norm": 0.5124378109452736,
"acc_norm_stderr": 0.03534439848539579
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322415,
"acc_norm": 0.35542168674698793,
"acc_norm_stderr": 0.03726214354322415
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.47953216374269003,
"acc_stderr": 0.038316105328219316,
"acc_norm": 0.47953216374269003,
"acc_norm_stderr": 0.038316105328219316
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487291,
"mc2": 0.35450592505891126,
"mc2_stderr": 0.014292262562897113
},
"harness|winogrande|5": {
"acc": 0.6361483820047356,
"acc_stderr": 0.013521488896883408
},
"harness|gsm8k|5": {
"acc": 0.09704321455648218,
"acc_stderr": 0.008153768274554735
}
}
```
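As a rough illustration, the per-task `acc` values in the JSON above can be averaged into a single mean. This is a sketch only — the leaderboard's own aggregation differs (e.g. the MMLU subtasks are grouped before averaging), and the dictionary below contains just three illustrative entries copied from the results, not the full set:

```python
# Sketch: averaging per-task accuracies from a results dict shaped like the one above.
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.5726495726495726},
    "harness|hendrycksTest-virology|5": {"acc": 0.35542168674698793},
    "harness|winogrande|5": {"acc": 0.6361483820047356},
}

# Unweighted mean over the selected tasks.
mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(mean_acc, 4))  # → 0.5214
```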
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Ekhidna (Fire Emblem)
This is a dataset of Ekhidna (Fire Emblem), containing 20 images and their tags.
The core tags of this character are `blue_eyes, blue_hair, earrings, headband, breasts, long_hair`, which have been pruned from the tag lists in this dataset.
Images were crawled from several sites (e.g., Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 21.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ekhidna_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 13.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ekhidna_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 40 | 24.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ekhidna_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 19.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ekhidna_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 40 | 32.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ekhidna_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ekhidna_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
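For the `IMG+TXT` packages listed above, each image is assumed to sit next to a same-named `.txt` file holding its comma-separated tags (an assumption based on the package type, not something the card states explicitly). A minimal sketch for pairing them after extracting one of those zips:

```python
import os

def pair_images_with_tags(directory):
    """Pair image files with same-named .txt tag files in a directory.

    Returns a list of (image_path, tag_text) tuples; images without a
    matching tag file are skipped.
    """
    pairs = []
    for name in sorted(os.listdir(directory)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
            continue
        tag_path = os.path.join(directory, stem + '.txt')
        if os.path.exists(tag_path):
            with open(tag_path, encoding='utf-8') as f:
                pairs.append((os.path.join(directory, name), f.read().strip()))
    return pairs
```

For the raw package, prefer the waifuc loader above, which also exposes the bundled meta information.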
## List of Clusters
Results of tag clustering; some outfits may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, solo, shoulder_spikes, smile, fingerless_gloves, weapon, bandana, jewelry, belt, looking_at_viewer, axe, boots, pauldrons, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | shoulder_spikes | smile | fingerless_gloves | weapon | bandana | jewelry | belt | looking_at_viewer | axe | boots | pauldrons | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:------------------|:--------|:--------------------|:---------|:----------|:----------|:-------|:--------------------|:------|:--------|:------------|:--------------------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 87599526
num_examples: 19949
download_size: 23472860
dataset_size: 87599526
---
# Dataset Card for "zhangshuoming/c_arm64_small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
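Given the `input`/`output` string schema declared in the metadata above, one plausible way to prepare a row for training is a simple prompt template. Both the separator and the example row below are hypothetical illustrations, not something this dataset prescribes:

```python
def format_example(example, sep="\n### Output:\n"):
    """Join an {'input': ..., 'output': ...} row into one training string.

    The separator is a hypothetical choice; adapt it to your tokenizer
    and training setup.
    """
    return example["input"] + sep + example["output"]

# Fabricated row for illustration only.
row = {"input": "int add(int a, int b) { return a + b; }",
       "output": "add:\n    add w0, w0, w1\n    ret"}
print(format_example(row))
```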
---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 286858
num_examples: 840
- name: validation
num_bytes: 15422
num_examples: 40
download_size: 128731
dataset_size: 302280
---
# Dataset Card for "tyzhu/random_letter_same_length_find_passage_train400_eval40_num"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)