datasetId | card |
|---|---|
naphatmanu/ikea-international-modern | ---
license: mit
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/90db6fe0 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1342
dataset_size: 182
---
# Dataset Card for "90db6fe0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
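The "90db6fe0" card above declares a minimal schema (a string `result` and an int64 `id`) over a single 10-example `train` split. As an illustrative sketch (the example rows below are hypothetical, not taken from the dataset), rows loaded from such a dataset can be checked against that declared schema in plain Python:

```python
# Declared features from the card: result -> string, id -> int64
SCHEMA = {"result": str, "id": int}

def matches_schema(row):
    """Return True when a row has exactly the declared columns with the right types."""
    return set(row) == set(SCHEMA) and all(
        isinstance(row[name], typ) for name, typ in SCHEMA.items()
    )

# Hypothetical rows shaped like the card's 10-example train split
rows = [{"result": "ok", "id": i} for i in range(10)]
print(all(matches_schema(r) for r in rows))  # True
```

A check like this is useful when a card's `dataset_info` block is the only documentation available, since it is the one place the column names and dtypes are stated.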
open-llm-leaderboard/details_s3nh__Severusectum-7B-DPO | ---
pretty_name: Evaluation run of s3nh/Severusectum-7B-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [s3nh/Severusectum-7B-DPO](https://huggingface.co/s3nh/Severusectum-7B-DPO) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_s3nh__Severusectum-7B-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-04T00:26:53.768955](https://huggingface.co/datasets/open-llm-leaderboard/details_s3nh__Severusectum-7B-DPO/blob/main/results_2024-02-04T00-26-53.768955.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6535295097742561,\n\
\ \"acc_stderr\": 0.03205169461375925,\n \"acc_norm\": 0.6531125947259314,\n\
\ \"acc_norm_stderr\": 0.03271927818828212,\n \"mc1\": 0.5520195838433293,\n\
\ \"mc1_stderr\": 0.017408513063422917,\n \"mc2\": 0.7245391094382377,\n\
\ \"mc2_stderr\": 0.01445327594903656\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6945392491467577,\n \"acc_stderr\": 0.013460080478002508,\n\
\ \"acc_norm\": 0.7150170648464164,\n \"acc_norm_stderr\": 0.013191348179838795\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6998605855407289,\n\
\ \"acc_stderr\": 0.00457381716300745,\n \"acc_norm\": 0.8854809798844852,\n\
\ \"acc_norm_stderr\": 0.0031778979482849352\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531003,\n \"\
acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531003\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n \"acc_norm\"\
: 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092444,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092444\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.01377869377846408,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.01377869377846408\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508297,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508297\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39888268156424583,\n\
\ \"acc_stderr\": 0.01637696614261008,\n \"acc_norm\": 0.39888268156424583,\n\
\ \"acc_norm_stderr\": 0.01637696614261008\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n\
\ \"acc_stderr\": 0.012749206007657473,\n \"acc_norm\": 0.47131681877444587,\n\
\ \"acc_norm_stderr\": 0.012749206007657473\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5520195838433293,\n\
\ \"mc1_stderr\": 0.017408513063422917,\n \"mc2\": 0.7245391094382377,\n\
\ \"mc2_stderr\": 0.01445327594903656\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828075\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7050796057619408,\n \
\ \"acc_stderr\": 0.012560698010954769\n }\n}\n```"
repo_url: https://huggingface.co/s3nh/Severusectum-7B-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|arc:challenge|25_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|gsm8k|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hellaswag|10_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T00-26-53.768955.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-04T00-26-53.768955.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- '**/details_harness|winogrande|5_2024-02-04T00-26-53.768955.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-04T00-26-53.768955.parquet'
- config_name: results
data_files:
- split: 2024_02_04T00_26_53.768955
path:
- results_2024-02-04T00-26-53.768955.parquet
- split: latest
path:
- results_2024-02-04T00-26-53.768955.parquet
---
# Dataset Card for Evaluation run of s3nh/Severusectum-7B-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [s3nh/Severusectum-7B-DPO](https://huggingface.co/s3nh/Severusectum-7B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_s3nh__Severusectum-7B-DPO",
"harness_winogrande_5",
split="train")
```
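Each timestamped split is named after the run timestamp with dashes and colons replaced by underscores (compare the split `2024_02_04T00_26_53.768955` with the parquet filenames above). The small helper below (hypothetical, inferred from the names in this card rather than from any official API) makes that mapping explicit:

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp such as '2024-02-04T00:26:53.768955'
    to the split name used in this dataset's configurations."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-02-04T00:26:53.768955"))
# Expect: 2024_02_04T00_26_53.768955
```

To load the aggregated scores rather than a single task, the `results` configuration can be passed with `split="latest"` in the same `load_dataset` call shown above.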
## Latest results
These are the [latest results from run 2024-02-04T00:26:53.768955](https://huggingface.co/datasets/open-llm-leaderboard/details_s3nh__Severusectum-7B-DPO/blob/main/results_2024-02-04T00-26-53.768955.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6535295097742561,
"acc_stderr": 0.03205169461375925,
"acc_norm": 0.6531125947259314,
"acc_norm_stderr": 0.03271927818828212,
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422917,
"mc2": 0.7245391094382377,
"mc2_stderr": 0.01445327594903656
},
"harness|arc:challenge|25": {
"acc": 0.6945392491467577,
"acc_stderr": 0.013460080478002508,
"acc_norm": 0.7150170648464164,
"acc_norm_stderr": 0.013191348179838795
},
"harness|hellaswag|10": {
"acc": 0.6998605855407289,
"acc_stderr": 0.00457381716300745,
"acc_norm": 0.8854809798844852,
"acc_norm_stderr": 0.0031778979482849352
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.025559920550531003,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.025559920550531003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092444,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092444
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.01377869377846408,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.01377869377846408
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508297,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508297
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39888268156424583,
"acc_stderr": 0.01637696614261008,
"acc_norm": 0.39888268156424583,
"acc_norm_stderr": 0.01637696614261008
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657473,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657473
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422917,
"mc2": 0.7245391094382377,
"mc2_stderr": 0.01445327594903656
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828075
},
"harness|gsm8k|5": {
"acc": 0.7050796057619408,
"acc_stderr": 0.012560698010954769
}
}
```
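The top-level `all` block appears to aggregate the per-task numbers. As an illustrative sketch only (the real aggregate averages over all of the evaluated tasks, not this handful), a macro-average accuracy over a few of the entries above can be computed like this:

```python
# Accuracies copied from a few of the per-task entries above (illustrative subset).
task_acc = {
    "hendrycksTest-abstract_algebra": 0.36,
    "hendrycksTest-anatomy": 0.6666666666666666,
    "hendrycksTest-astronomy": 0.7105263157894737,
    "hendrycksTest-computer_security": 0.75,
}

macro_avg = sum(task_acc.values()) / len(task_acc)
print(f"macro-average acc over {len(task_acc)} tasks: {macro_avg:.4f}")
```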
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
TalTechNLP/err-newsroom-keyphrases | ---
license: cc-by-4.0
task_categories:
- summarization
- text2text-generation
language:
- et
pretty_name: ERR Newsroom Keyphrases
---
# ERR Newsroom Keyphrases
This dataset is a subset of [ERR Newsroom](https://huggingface.co/datasets/TalTechNLP/err-newsroom), with up to 5 keyphrases assigned to each news article. The keyphrases are generated using the OpenAI API, using the `gpt-3.5-turbo` model (see the script `extract-keywords-openai.py`). |
SubsWay/my_whisper_test | ---
license: mit
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 7685160
num_examples: 8
- name: test
num_bytes: 960648
num_examples: 1
- name: valid
num_bytes: 960512
num_examples: 1
download_size: 2534073
dataset_size: 9606320
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
ibivibiv/alpaca_tiny12 | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 460677255
num_examples: 290901
download_size: 266439258
dataset_size: 460677255
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
unalignment/toxic-dpo-v0.2 | ---
license: cc-by-4.0
tags:
- not-for-all-audiences
---
## Toxic-DPO
This is a highly toxic, "harmful" dataset meant to illustrate how direct preference optimization (DPO) can be used to de-censor/unalign a model quite easily, using very few examples.
Many of the examples still contain some amount of warnings/disclaimers, so it's still somewhat editorialized.
## Usage restriction
To use this data, you must acknowledge/agree to the following:
- data contained within is "toxic"/"harmful", and contains profanity and other types of sensitive content
- none of the content or views contained in the dataset necessarily align with my personal beliefs or opinions, they are simply text generated by LLMs automatically
- you are able to use the dataset lawfully, particularly in locations with less-than-free speech laws
- you, and you alone, are responsible for having downloaded and used the dataset, and I am completely indemnified from any and all liabilities
This dataset is meant __*exclusively*__ for academic/research or other non-nefarious use-cases. |
open-llm-leaderboard/details_psyche__kollama2-7b-v3 | ---
pretty_name: Evaluation run of psyche/kollama2-7b-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [psyche/kollama2-7b-v3](https://huggingface.co/psyche/kollama2-7b-v3) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psyche__kollama2-7b-v3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-28T08:31:05.396495](https://huggingface.co/datasets/open-llm-leaderboard/details_psyche__kollama2-7b-v3/blob/main/results_2023-08-28T08%3A31%3A05.396495.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.4080126650463043,\n \"\
acc_stderr\": 0.03490803118091981,\n \"acc_norm\": 0.41212249060720096,\n\
\ \"acc_norm_stderr\": 0.034895302556044526,\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236618,\n \"mc2\": 0.42921423081004945,\n\
\ \"mc2_stderr\": 0.014206971382449723\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4539249146757679,\n \"acc_stderr\": 0.01454922110517187,\n\
\ \"acc_norm\": 0.4974402730375427,\n \"acc_norm_stderr\": 0.014611199329843784\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5855407289384584,\n\
\ \"acc_stderr\": 0.004916216503770336,\n \"acc_norm\": 0.7845050786695877,\n\
\ \"acc_norm_stderr\": 0.004103249411456488\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3925925925925926,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.3925925925925926,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.35526315789473684,\n \"acc_stderr\": 0.038947344870133176,\n\
\ \"acc_norm\": 0.35526315789473684,\n \"acc_norm_stderr\": 0.038947344870133176\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.37735849056603776,\n \"acc_stderr\": 0.029832808114796005,\n\
\ \"acc_norm\": 0.37735849056603776,\n \"acc_norm_stderr\": 0.029832808114796005\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4097222222222222,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.4097222222222222,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.3236994219653179,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.38620689655172413,\n \"acc_stderr\": 0.04057324734419034,\n\
\ \"acc_norm\": 0.38620689655172413,\n \"acc_norm_stderr\": 0.04057324734419034\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23544973544973544,\n \"acc_stderr\": 0.021851509822031722,\n \"\
acc_norm\": 0.23544973544973544,\n \"acc_norm_stderr\": 0.021851509822031722\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.04190596438871137,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.04190596438871137\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3935483870967742,\n\
\ \"acc_stderr\": 0.027791878753132274,\n \"acc_norm\": 0.3935483870967742,\n\
\ \"acc_norm_stderr\": 0.027791878753132274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868408,\n\
\ \"acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868408\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4909090909090909,\n \"acc_stderr\": 0.0390369864774844,\n\
\ \"acc_norm\": 0.4909090909090909,\n \"acc_norm_stderr\": 0.0390369864774844\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.41919191919191917,\n \"acc_stderr\": 0.035155207286704175,\n \"\
acc_norm\": 0.41919191919191917,\n \"acc_norm_stderr\": 0.035155207286704175\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5958549222797928,\n \"acc_stderr\": 0.0354150857888402,\n\
\ \"acc_norm\": 0.5958549222797928,\n \"acc_norm_stderr\": 0.0354150857888402\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3487179487179487,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.3487179487179487,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3445378151260504,\n \"acc_stderr\": 0.030868682604121626,\n\
\ \"acc_norm\": 0.3445378151260504,\n \"acc_norm_stderr\": 0.030868682604121626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.0347918557259966,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.0347918557259966\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.48807339449541287,\n \"acc_stderr\": 0.021431223617362223,\n \"\
acc_norm\": 0.48807339449541287,\n \"acc_norm_stderr\": 0.021431223617362223\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.23148148148148148,\n \"acc_stderr\": 0.028765111718046965,\n \"\
acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.028765111718046965\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.44607843137254904,\n \"acc_stderr\": 0.03488845451304974,\n \"\
acc_norm\": 0.44607843137254904,\n \"acc_norm_stderr\": 0.03488845451304974\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4430379746835443,\n \"acc_stderr\": 0.03233532777533484,\n \
\ \"acc_norm\": 0.4430379746835443,\n \"acc_norm_stderr\": 0.03233532777533484\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.515695067264574,\n\
\ \"acc_stderr\": 0.0335412657542081,\n \"acc_norm\": 0.515695067264574,\n\
\ \"acc_norm_stderr\": 0.0335412657542081\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4732824427480916,\n \"acc_stderr\": 0.04379024936553894,\n\
\ \"acc_norm\": 0.4732824427480916,\n \"acc_norm_stderr\": 0.04379024936553894\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5785123966942148,\n \"acc_stderr\": 0.04507732278775087,\n \"\
acc_norm\": 0.5785123966942148,\n \"acc_norm_stderr\": 0.04507732278775087\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.42592592592592593,\n\
\ \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.42592592592592593,\n\
\ \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4294478527607362,\n \"acc_stderr\": 0.03889066619112722,\n\
\ \"acc_norm\": 0.4294478527607362,\n \"acc_norm_stderr\": 0.03889066619112722\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.47572815533980584,\n \"acc_stderr\": 0.049449010929737795,\n\
\ \"acc_norm\": 0.47572815533980584,\n \"acc_norm_stderr\": 0.049449010929737795\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6025641025641025,\n\
\ \"acc_stderr\": 0.032059534537892925,\n \"acc_norm\": 0.6025641025641025,\n\
\ \"acc_norm_stderr\": 0.032059534537892925\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5938697318007663,\n\
\ \"acc_stderr\": 0.017562037406478923,\n \"acc_norm\": 0.5938697318007663,\n\
\ \"acc_norm_stderr\": 0.017562037406478923\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.45664739884393063,\n \"acc_stderr\": 0.02681771813034892,\n\
\ \"acc_norm\": 0.45664739884393063,\n \"acc_norm_stderr\": 0.02681771813034892\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.40522875816993464,\n \"acc_stderr\": 0.02811092849280908,\n\
\ \"acc_norm\": 0.40522875816993464,\n \"acc_norm_stderr\": 0.02811092849280908\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5273311897106109,\n\
\ \"acc_stderr\": 0.02835563356832818,\n \"acc_norm\": 0.5273311897106109,\n\
\ \"acc_norm_stderr\": 0.02835563356832818\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4567901234567901,\n \"acc_stderr\": 0.027716661650194045,\n\
\ \"acc_norm\": 0.4567901234567901,\n \"acc_norm_stderr\": 0.027716661650194045\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611327,\n \
\ \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611327\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.33572359843546284,\n\
\ \"acc_stderr\": 0.012061304157664604,\n \"acc_norm\": 0.33572359843546284,\n\
\ \"acc_norm_stderr\": 0.012061304157664604\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.36764705882352944,\n \"acc_stderr\": 0.029289413409403196,\n\
\ \"acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.029289413409403196\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4150326797385621,\n \"acc_stderr\": 0.01993362777685741,\n \
\ \"acc_norm\": 0.4150326797385621,\n \"acc_norm_stderr\": 0.01993362777685741\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794917,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794917\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.27755102040816326,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.27755102040816326,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5074626865671642,\n\
\ \"acc_stderr\": 0.035351400842767194,\n \"acc_norm\": 0.5074626865671642,\n\
\ \"acc_norm_stderr\": 0.035351400842767194\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3674698795180723,\n\
\ \"acc_stderr\": 0.03753267402120575,\n \"acc_norm\": 0.3674698795180723,\n\
\ \"acc_norm_stderr\": 0.03753267402120575\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6081871345029239,\n \"acc_stderr\": 0.037439798259263996,\n\
\ \"acc_norm\": 0.6081871345029239,\n \"acc_norm_stderr\": 0.037439798259263996\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236618,\n \"mc2\": 0.42921423081004945,\n\
\ \"mc2_stderr\": 0.014206971382449723\n }\n}\n```"
repo_url: https://huggingface.co/psyche/kollama2-7b-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|arc:challenge|25_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hellaswag|10_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T08:31:05.396495.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T08:31:05.396495.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T08:31:05.396495.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T08:31:05.396495.parquet'
- config_name: results
data_files:
- split: 2023_08_28T08_31_05.396495
path:
- results_2023-08-28T08:31:05.396495.parquet
- split: latest
path:
- results_2023-08-28T08:31:05.396495.parquet
---
# Dataset Card for Evaluation run of psyche/kollama2-7b-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psyche/kollama2-7b-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [psyche/kollama2-7b-v3](https://huggingface.co/psyche/kollama2-7b-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psyche__kollama2-7b-v3",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-28T08:31:05.396495](https://huggingface.co/datasets/open-llm-leaderboard/details_psyche__kollama2-7b-v3/blob/main/results_2023-08-28T08%3A31%3A05.396495.json):
```python
{
"all": {
"acc": 0.4080126650463043,
"acc_stderr": 0.03490803118091981,
"acc_norm": 0.41212249060720096,
"acc_norm_stderr": 0.034895302556044526,
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236618,
"mc2": 0.42921423081004945,
"mc2_stderr": 0.014206971382449723
},
"harness|arc:challenge|25": {
"acc": 0.4539249146757679,
"acc_stderr": 0.01454922110517187,
"acc_norm": 0.4974402730375427,
"acc_norm_stderr": 0.014611199329843784
},
"harness|hellaswag|10": {
"acc": 0.5855407289384584,
"acc_stderr": 0.004916216503770336,
"acc_norm": 0.7845050786695877,
"acc_norm_stderr": 0.004103249411456488
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3925925925925926,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.3925925925925926,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.35526315789473684,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.35526315789473684,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.37735849056603776,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.37735849056603776,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4097222222222222,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.4097222222222222,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179963,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179963
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.38620689655172413,
"acc_stderr": 0.04057324734419034,
"acc_norm": 0.38620689655172413,
"acc_norm_stderr": 0.04057324734419034
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23544973544973544,
"acc_stderr": 0.021851509822031722,
"acc_norm": 0.23544973544973544,
"acc_norm_stderr": 0.021851509822031722
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871137,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871137
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3935483870967742,
"acc_stderr": 0.027791878753132274,
"acc_norm": 0.3935483870967742,
"acc_norm_stderr": 0.027791878753132274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868408,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868408
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.0390369864774844,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.0390369864774844
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.41919191919191917,
"acc_stderr": 0.035155207286704175,
"acc_norm": 0.41919191919191917,
"acc_norm_stderr": 0.035155207286704175
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5958549222797928,
"acc_stderr": 0.0354150857888402,
"acc_norm": 0.5958549222797928,
"acc_norm_stderr": 0.0354150857888402
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3487179487179487,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.3487179487179487,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3445378151260504,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.3445378151260504,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.0347918557259966,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.0347918557259966
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.48807339449541287,
"acc_stderr": 0.021431223617362223,
"acc_norm": 0.48807339449541287,
"acc_norm_stderr": 0.021431223617362223
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.028765111718046965,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.028765111718046965
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.44607843137254904,
"acc_stderr": 0.03488845451304974,
"acc_norm": 0.44607843137254904,
"acc_norm_stderr": 0.03488845451304974
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4430379746835443,
"acc_stderr": 0.03233532777533484,
"acc_norm": 0.4430379746835443,
"acc_norm_stderr": 0.03233532777533484
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.515695067264574,
"acc_stderr": 0.0335412657542081,
"acc_norm": 0.515695067264574,
"acc_norm_stderr": 0.0335412657542081
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4732824427480916,
"acc_stderr": 0.04379024936553894,
"acc_norm": 0.4732824427480916,
"acc_norm_stderr": 0.04379024936553894
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5785123966942148,
"acc_stderr": 0.04507732278775087,
"acc_norm": 0.5785123966942148,
"acc_norm_stderr": 0.04507732278775087
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.0478034362693679,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.0478034362693679
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4294478527607362,
"acc_stderr": 0.03889066619112722,
"acc_norm": 0.4294478527607362,
"acc_norm_stderr": 0.03889066619112722
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.47572815533980584,
"acc_stderr": 0.049449010929737795,
"acc_norm": 0.47572815533980584,
"acc_norm_stderr": 0.049449010929737795
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.032059534537892925,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.032059534537892925
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5938697318007663,
"acc_stderr": 0.017562037406478923,
"acc_norm": 0.5938697318007663,
"acc_norm_stderr": 0.017562037406478923
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.45664739884393063,
"acc_stderr": 0.02681771813034892,
"acc_norm": 0.45664739884393063,
"acc_norm_stderr": 0.02681771813034892
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.40522875816993464,
"acc_stderr": 0.02811092849280908,
"acc_norm": 0.40522875816993464,
"acc_norm_stderr": 0.02811092849280908
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5273311897106109,
"acc_stderr": 0.02835563356832818,
"acc_norm": 0.5273311897106109,
"acc_norm_stderr": 0.02835563356832818
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4567901234567901,
"acc_stderr": 0.027716661650194045,
"acc_norm": 0.4567901234567901,
"acc_norm_stderr": 0.027716661650194045
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611327,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611327
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.33572359843546284,
"acc_stderr": 0.012061304157664604,
"acc_norm": 0.33572359843546284,
"acc_norm_stderr": 0.012061304157664604
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.029289413409403196,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.029289413409403196
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4150326797385621,
"acc_stderr": 0.01993362777685741,
"acc_norm": 0.4150326797385621,
"acc_norm_stderr": 0.01993362777685741
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794917,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794917
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.27755102040816326,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.27755102040816326,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5074626865671642,
"acc_stderr": 0.035351400842767194,
"acc_norm": 0.5074626865671642,
"acc_norm_stderr": 0.035351400842767194
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3674698795180723,
"acc_stderr": 0.03753267402120575,
"acc_norm": 0.3674698795180723,
"acc_norm_stderr": 0.03753267402120575
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6081871345029239,
"acc_stderr": 0.037439798259263996,
"acc_norm": 0.6081871345029239,
"acc_norm_stderr": 0.037439798259263996
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236618,
"mc2": 0.42921423081004945,
"mc2_stderr": 0.014206971382449723
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Hiraishin/Reddit-Malaysia | ---
license: apache-2.0
language:
- en
- ms
---
# Reddit Crawler on Malaysia Subreddit using Selenium
This Hugging Face dataset repository serves as a dedicated data store for an Extract, Transform, Load (ETL) pipeline built with MageAI. The pipeline is specifically crafted for harvesting data from the Malaysia subreddit on Reddit. Leveraging Selenium, the ETL process systematically collects information from five distinct sections of the subreddit: Hot, New, Rising, Controversial, and Top.
# Usage
This dataset is specifically curated for users aiming to train Large Language Models (LLMs) by providing a rich and diverse set of data from the Malaysia subreddit. With a focus on fostering language understanding and generation, it is a valuable resource for training LLMs capable of capturing the nuances and dynamics of online discussions.
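Because the same post can surface in more than one of the five scraped sections, a downstream consumer may want to deduplicate records before training. A minimal sketch (the `"id"` field name is an assumption, not a guarantee about this dataset's actual schema):

```python
def deduplicate(posts):
    """Keep only the first occurrence of each post, keyed on its ID.

    `posts` is an iterable of dicts that are assumed to carry an "id" key;
    order of first appearance is preserved.
    """
    seen = set()
    unique = []
    for post in posts:
        if post["id"] not in seen:
            seen.add(post["id"])
            unique.append(post)
    return unique
```

Adapt the key to whatever unique identifier the scraped records actually expose.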
heliosprime/twitter_dataset_1713160570 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 14033
num_examples: 36
download_size: 15893
dataset_size: 14033
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713160570"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-inverse-scaling__redefine-math-inverse-scaling__redefin-f7efd9-1695359604 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- inverse-scaling/redefine-math
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-30b_eval
metrics: []
dataset_name: inverse-scaling/redefine-math
dataset_config: inverse-scaling--redefine-math
dataset_split: train
col_mapping:
text: prompt
classes: classes
target: answer_index
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-30b_eval
* Dataset: inverse-scaling/redefine-math
* Config: inverse-scaling--redefine-math
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@MicPie](https://huggingface.co/MicPie) for evaluating this model. |
diffusers-parti-prompts/sdxl-0.9-refiner | ---
dataset_info:
features:
- name: Prompt
dtype: string
- name: Category
dtype: string
- name: Challenge
dtype: string
- name: Note
dtype: string
- name: images
dtype: image
- name: model_name
dtype: string
- name: seed
dtype: int64
splits:
- name: train
num_bytes: 186370500.896
num_examples: 1632
download_size: 185820089
dataset_size: 186370500.896
---
# Dataset Card for "sdxl-0.9-refiner"
Dataset was generated using the code below:
```python
import torch
from datasets import Dataset, Features
from datasets import Image as ImageFeature
from datasets import Value, load_dataset
from diffusers import DDIMScheduler, DiffusionPipeline
import PIL
def main():
print("Loading dataset...")
parti_prompts = load_dataset("nateraw/parti-prompts", split="train")
print("Loading pipeline...")
ckpt_id = "stabilityai/stable-diffusion-xl-base-0.9"
refiner_ckpt_id = "stabilityai/stable-diffusion-xl-refiner-0.9"
pipe = DiffusionPipeline.from_pretrained(
ckpt_id, torch_dtype=torch.float16, use_auth_token=True
).to("cuda")
pipe.scheduler = DDIMScheduler.from_config(pipe.scheduler.config)
pipe.set_progress_bar_config(disable=True)
refiner = DiffusionPipeline.from_pretrained(
refiner_ckpt_id,
torch_dtype=torch.float16,
use_auth_token=True
).to("cuda")
refiner.scheduler = DDIMScheduler.from_config(refiner.scheduler.config)
refiner.set_progress_bar_config(disable=True)
seed = 0
generator = torch.Generator("cuda").manual_seed(seed)
print("Running inference...")
main_dict = {}
for i in range(len(parti_prompts)):
sample = parti_prompts[i]
prompt = sample["Prompt"]
latent = pipe(
prompt,
generator=generator,
num_inference_steps=100,
guidance_scale=7.5,
output_type="latent",
).images[0]
image_refined = refiner(
prompt=prompt,
image=latent[None, :],
generator=generator,
num_inference_steps=100,
guidance_scale=7.5,
).images[0]
image = image_refined.resize((256, 256), resample=PIL.Image.Resampling.LANCZOS)
img_path = f"sd_xl_{i}.png"
image.save(img_path)
main_dict.update(
{
prompt: {
"img_path": img_path,
"Category": sample["Category"],
"Challenge": sample["Challenge"],
"Note": sample["Note"],
"model_name": ckpt_id,
"seed": seed,
}
}
)
def generation_fn():
for prompt in main_dict:
prompt_entry = main_dict[prompt]
yield {
"Prompt": prompt,
"Category": prompt_entry["Category"],
"Challenge": prompt_entry["Challenge"],
"Note": prompt_entry["Note"],
"images": {"path": prompt_entry["img_path"]},
"model_name": prompt_entry["model_name"],
"seed": prompt_entry["seed"],
}
print("Preparing HF dataset...")
ds = Dataset.from_generator(
generation_fn,
features=Features(
Prompt=Value("string"),
Category=Value("string"),
Challenge=Value("string"),
Note=Value("string"),
images=ImageFeature(),
model_name=Value("string"),
seed=Value("int64"),
),
)
ds_id = "diffusers-parti-prompts/sdxl-0.9-refiner"
ds.push_to_hub(ds_id)
if __name__ == "__main__":
main()
``` |
kimvu/agieval | ---
license: apache-2.0
language:
- en
- zh
pretty_name: agieval_full
--- |
KennNguyenDev/FiQA_Financial_Phrasebank_Combined | ---
license: cc0-1.0
task_categories:
- text-classification
language:
- en
tags:
- finance
size_categories:
- 1K<n<10K
---
Altered version of the dataset from: https://www.kaggle.com/datasets/sbhatti/financial-sentiment-analysis
Changed sentiment labels into numeric values |
open-llm-leaderboard/details_PetroGPT__Voldemort-10B-DPO | ---
pretty_name: Evaluation run of PetroGPT/Voldemort-10B-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PetroGPT/Voldemort-10B-DPO](https://huggingface.co/PetroGPT/Voldemort-10B-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PetroGPT__Voldemort-10B-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-20T12:02:57.927448](https://huggingface.co/datasets/open-llm-leaderboard/details_PetroGPT__Voldemort-10B-DPO/blob/main/results_2024-01-20T12-02-57.927448.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6305876260706662,\n\
\ \"acc_stderr\": 0.03255938653931723,\n \"acc_norm\": 0.6330868385686215,\n\
\ \"acc_norm_stderr\": 0.033208227030172364,\n \"mc1\": 0.41615667074663404,\n\
\ \"mc1_stderr\": 0.017255657502903046,\n \"mc2\": 0.6144474102286928,\n\
\ \"mc2_stderr\": 0.015672191454631425\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6407849829351536,\n \"acc_stderr\": 0.014020224155839159,\n\
\ \"acc_norm\": 0.6604095563139932,\n \"acc_norm_stderr\": 0.013839039762820169\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6731726747659829,\n\
\ \"acc_stderr\": 0.004680949283855316,\n \"acc_norm\": 0.8484365664210317,\n\
\ \"acc_norm_stderr\": 0.0035786433875478452\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.037150621549989056,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.037150621549989056\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782655,\n \"\
acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782655\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959217,\n \"\
acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959217\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124484,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124484\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110936,\n\
\ \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110936\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135356,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135356\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.036848815213890225,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.036848815213890225\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8091743119266055,\n \"acc_stderr\": 0.016847676400091098,\n \"\
acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.016847676400091098\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n\
\ \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069436,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069436\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281382,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281382\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464076,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464076\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677006,\n\
\ \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677006\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38324022346368714,\n\
\ \"acc_stderr\": 0.016260159604429128,\n \"acc_norm\": 0.38324022346368714,\n\
\ \"acc_norm_stderr\": 0.016260159604429128\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.02540719779889016,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.02540719779889016\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4367666232073012,\n\
\ \"acc_stderr\": 0.012667701919603662,\n \"acc_norm\": 0.4367666232073012,\n\
\ \"acc_norm_stderr\": 0.012667701919603662\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.02873932851398357,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.02873932851398357\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6503267973856209,\n \"acc_stderr\": 0.019291961895066382,\n \
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.019291961895066382\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786855,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786855\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41615667074663404,\n\
\ \"mc1_stderr\": 0.017255657502903046,\n \"mc2\": 0.6144474102286928,\n\
\ \"mc2_stderr\": 0.015672191454631425\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838229\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5382865807429871,\n \
\ \"acc_stderr\": 0.01373204822701668\n }\n}\n```"
repo_url: https://huggingface.co/PetroGPT/Voldemort-10B-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|arc:challenge|25_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|arc:challenge|25_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|gsm8k|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|gsm8k|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hellaswag|10_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hellaswag|10_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T09-59-49.442476.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T12-02-57.927448.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-20T12-02-57.927448.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- '**/details_harness|winogrande|5_2024-01-20T09-59-49.442476.parquet'
- split: 2024_01_20T12_02_57.927448
path:
- '**/details_harness|winogrande|5_2024-01-20T12-02-57.927448.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-20T12-02-57.927448.parquet'
- config_name: results
data_files:
- split: 2024_01_20T09_59_49.442476
path:
- results_2024-01-20T09-59-49.442476.parquet
- split: 2024_01_20T12_02_57.927448
path:
- results_2024-01-20T12-02-57.927448.parquet
- split: latest
path:
- results_2024-01-20T12-02-57.927448.parquet
---
# Dataset Card for Evaluation run of PetroGPT/Voldemort-10B-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [PetroGPT/Voldemort-10B-DPO](https://huggingface.co/PetroGPT/Voldemort-10B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PetroGPT__Voldemort-10B-DPO",
"harness_winogrande_5",
split="train")
```
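Since each run-specific split is named after its timestamp (as listed in the configurations above), the most recent run can also be identified by sorting the split names directly, without relying on the "latest" alias. A minimal sketch, using the two run timestamps from this card as an illustrative list:

```python
# Split names as they appear in this dataset's configurations: one per
# evaluation run, plus a "latest" alias pointing at the most recent run.
split_names = [
    "2024_01_20T09_59_49.442476",
    "2024_01_20T12_02_57.927448",
    "latest",
]

# The timestamp format is zero-padded with the most significant fields first,
# so lexicographic order matches chronological order and max() picks the
# newest run.
timestamped = [name for name in split_names if name != "latest"]
newest_run = max(timestamped)
print(newest_run)  # 2024_01_20T12_02_57.927448
```

This is handy when successive evaluations did not all cover the same tasks and you want to inspect a specific run rather than the aggregated "latest" view.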
## Latest results
These are the [latest results from run 2024-01-20T12:02:57.927448](https://huggingface.co/datasets/open-llm-leaderboard/details_PetroGPT__Voldemort-10B-DPO/blob/main/results_2024-01-20T12-02-57.927448.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6305876260706662,
"acc_stderr": 0.03255938653931723,
"acc_norm": 0.6330868385686215,
"acc_norm_stderr": 0.033208227030172364,
"mc1": 0.41615667074663404,
"mc1_stderr": 0.017255657502903046,
"mc2": 0.6144474102286928,
"mc2_stderr": 0.015672191454631425
},
"harness|arc:challenge|25": {
"acc": 0.6407849829351536,
"acc_stderr": 0.014020224155839159,
"acc_norm": 0.6604095563139932,
"acc_norm_stderr": 0.013839039762820169
},
"harness|hellaswag|10": {
"acc": 0.6731726747659829,
"acc_stderr": 0.004680949283855316,
"acc_norm": 0.8484365664210317,
"acc_norm_stderr": 0.0035786433875478452
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.037150621549989056,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.037150621549989056
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124484,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124484
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110936,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110936
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135356,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135356
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.036848815213890225,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.036848815213890225
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8091743119266055,
"acc_stderr": 0.016847676400091098,
"acc_norm": 0.8091743119266055,
"acc_norm_stderr": 0.016847676400091098
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069436,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281382,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464076,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464076
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388677006,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388677006
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38324022346368714,
"acc_stderr": 0.016260159604429128,
"acc_norm": 0.38324022346368714,
"acc_norm_stderr": 0.016260159604429128
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.02540719779889016,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.02540719779889016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4367666232073012,
"acc_stderr": 0.012667701919603662,
"acc_norm": 0.4367666232073012,
"acc_norm_stderr": 0.012667701919603662
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.02873932851398357,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.02873932851398357
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.019291961895066382,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.019291961895066382
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786855,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786855
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41615667074663404,
"mc1_stderr": 0.017255657502903046,
"mc2": 0.6144474102286928,
"mc2_stderr": 0.015672191454631425
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838229
},
"harness|gsm8k|5": {
"acc": 0.5382865807429871,
"acc_stderr": 0.01373204822701668
}
}
```
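The leaderboard's headline MMLU figure is a simple macro average over the 57 `hendrycksTest` subtask accuracies listed above. A minimal sketch of that aggregation, using only four of the subtask scores reported in this run (the shortened task list is illustrative, not the full 57):

```python
# Macro-average a few of the per-task MMLU accuracies reported above.
# The real leaderboard averages all 57 hendrycksTest subtasks; this
# shortened dictionary is illustrative only.
scores = {
    "hendrycksTest-college_chemistry": 0.46,
    "hendrycksTest-college_computer_science": 0.54,
    "hendrycksTest-college_mathematics": 0.34,
    "hendrycksTest-computer_security": 0.76,
}

macro_avg = sum(scores.values()) / len(scores)
print(round(macro_avg, 4))  # 0.525
```

Feeding in all 57 subtask accuracies from the JSON above would reproduce the aggregated MMLU value stored in the `results` configuration.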
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
TheAIchemist13/hindi_asr_dataset_2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcriptions
dtype: string
splits:
- name: train
num_bytes: 60362774.0
num_examples: 175
- name: test
num_bytes: 3849203.0
num_examples: 5
download_size: 59670172
dataset_size: 64211977.0
---
# Dataset Card for "hindi_asr_dataset_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_uukuguy__speechless-coder-ds-1.3b | ---
pretty_name: Evaluation run of uukuguy/speechless-coder-ds-1.3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-coder-ds-1.3b](https://huggingface.co/uukuguy/speechless-coder-ds-1.3b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-coder-ds-1.3b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T06:48:01.416618](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-coder-ds-1.3b/blob/main/results_2023-12-30T06-48-01.416618.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2506488389989993,\n\
\ \"acc_stderr\": 0.030580639760232627,\n \"acc_norm\": 0.25128428880523795,\n\
\ \"acc_norm_stderr\": 0.03131936078924142,\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156507,\n \"mc2\": 0.4211587968245106,\n\
\ \"mc2_stderr\": 0.01485785907132671\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23890784982935154,\n \"acc_stderr\": 0.012461071376316614,\n\
\ \"acc_norm\": 0.26535836177474403,\n \"acc_norm_stderr\": 0.012902554762313967\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3313085042820155,\n\
\ \"acc_stderr\": 0.004697217912462985,\n \"acc_norm\": 0.39494124676359293,\n\
\ \"acc_norm_stderr\": 0.0048783902265917105\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.21481481481481482,\n\
\ \"acc_stderr\": 0.035478541985608236,\n \"acc_norm\": 0.21481481481481482,\n\
\ \"acc_norm_stderr\": 0.035478541985608236\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.16447368421052633,\n \"acc_stderr\": 0.0301675334686327,\n\
\ \"acc_norm\": 0.16447368421052633,\n \"acc_norm_stderr\": 0.0301675334686327\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.26037735849056604,\n \"acc_stderr\": 0.027008766090708094,\n\
\ \"acc_norm\": 0.26037735849056604,\n \"acc_norm_stderr\": 0.027008766090708094\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.16,\n \"acc_stderr\": 0.0368452949177471,\n \"acc_norm\"\
: 0.16,\n \"acc_norm_stderr\": 0.0368452949177471\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.03036358219723817,\n\
\ \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.03036358219723817\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708614,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708614\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276862,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276862\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.21935483870967742,\n\
\ \"acc_stderr\": 0.02354079935872329,\n \"acc_norm\": 0.21935483870967742,\n\
\ \"acc_norm_stderr\": 0.02354079935872329\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.21182266009852216,\n \"acc_stderr\": 0.02874898368994107,\n\
\ \"acc_norm\": 0.21182266009852216,\n \"acc_norm_stderr\": 0.02874898368994107\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24870466321243523,\n \"acc_stderr\": 0.031195840877700314,\n\
\ \"acc_norm\": 0.24870466321243523,\n \"acc_norm_stderr\": 0.031195840877700314\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.020932445774463196,\n\
\ \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.020932445774463196\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.02592887613276612,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.02592887613276612\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436775,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23302752293577983,\n \"acc_stderr\": 0.018125669180861496,\n \"\
acc_norm\": 0.23302752293577983,\n \"acc_norm_stderr\": 0.018125669180861496\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37962962962962965,\n \"acc_stderr\": 0.03309682581119035,\n \"\
acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.03309682581119035\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.03058759135160424,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.03058759135160424\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.27848101265822783,\n \"acc_stderr\": 0.029178682304842562,\n \
\ \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.029178682304842562\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3183856502242152,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.3183856502242152,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2809917355371901,\n \"acc_stderr\": 0.04103203830514512,\n \"\
acc_norm\": 0.2809917355371901,\n \"acc_norm_stderr\": 0.04103203830514512\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286773,\n\
\ \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286773\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3247863247863248,\n\
\ \"acc_stderr\": 0.030679022765498828,\n \"acc_norm\": 0.3247863247863248,\n\
\ \"acc_norm_stderr\": 0.030679022765498828\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2937420178799489,\n\
\ \"acc_stderr\": 0.016287759388491682,\n \"acc_norm\": 0.2937420178799489,\n\
\ \"acc_norm_stderr\": 0.016287759388491682\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2837988826815642,\n\
\ \"acc_stderr\": 0.015078358970751778,\n \"acc_norm\": 0.2837988826815642,\n\
\ \"acc_norm_stderr\": 0.015078358970751778\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.024954184324879912,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.024954184324879912\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2090032154340836,\n\
\ \"acc_stderr\": 0.02309314039837422,\n \"acc_norm\": 0.2090032154340836,\n\
\ \"acc_norm_stderr\": 0.02309314039837422\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967277,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967277\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290392,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290392\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23598435462842243,\n\
\ \"acc_stderr\": 0.01084480266966268,\n \"acc_norm\": 0.23598435462842243,\n\
\ \"acc_norm_stderr\": 0.01084480266966268\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3088235294117647,\n \"acc_stderr\": 0.028064998167040094,\n\
\ \"acc_norm\": 0.3088235294117647,\n \"acc_norm_stderr\": 0.028064998167040094\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n\
\ \"acc_stderr\": 0.04461272175910507,\n \"acc_norm\": 0.3181818181818182,\n\
\ \"acc_norm_stderr\": 0.04461272175910507\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.15918367346938775,\n \"acc_stderr\": 0.02342097206916635,\n\
\ \"acc_norm\": 0.15918367346938775,\n \"acc_norm_stderr\": 0.02342097206916635\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156507,\n \"mc2\": 0.4211587968245106,\n\
\ \"mc2_stderr\": 0.01485785907132671\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5303867403314917,\n \"acc_stderr\": 0.014026510839428737\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02350265352539803,\n \
\ \"acc_stderr\": 0.004172883669643965\n }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-coder-ds-1.3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|arc:challenge|25_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|gsm8k|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hellaswag|10_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T06-48-01.416618.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T06-48-01.416618.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- '**/details_harness|winogrande|5_2023-12-30T06-48-01.416618.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T06-48-01.416618.parquet'
- config_name: results
data_files:
- split: 2023_12_30T06_48_01.416618
path:
- results_2023-12-30T06-48-01.416618.parquet
- split: latest
path:
- results_2023-12-30T06-48-01.416618.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-coder-ds-1.3b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [uukuguy/speechless-coder-ds-1.3b](https://huggingface.co/uukuguy/speechless-coder-ds-1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-coder-ds-1.3b",
"harness_winogrande_5",
split="train")
```
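Each entry in the aggregated results is a plain nested mapping from task name to metric values. As a minimal local sketch (using a hand-copied subset of the JSON shown under "Latest results" below rather than a live download, and averaging only three tasks for illustration), you can aggregate per-task accuracies like this:

```python
# Minimal sketch: averaging per-task accuracies from a results-style dict.
# The sample values are copied from the "Latest results" JSON in this card;
# in practice the full dict comes from loading the "results" configuration.
results = {
    "harness|arc:challenge|25": {"acc": 0.23890784982935154},
    "harness|hellaswag|10": {"acc": 0.3313085042820155},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.24},
}

# Mean accuracy across the selected tasks.
mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(mean_acc, 4))  # 0.2701
```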
## Latest results
These are the [latest results from run 2023-12-30T06:48:01.416618](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-coder-ds-1.3b/blob/main/results_2023-12-30T06-48-01.416618.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2506488389989993,
"acc_stderr": 0.030580639760232627,
"acc_norm": 0.25128428880523795,
"acc_norm_stderr": 0.03131936078924142,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156507,
"mc2": 0.4211587968245106,
"mc2_stderr": 0.01485785907132671
},
"harness|arc:challenge|25": {
"acc": 0.23890784982935154,
"acc_stderr": 0.012461071376316614,
"acc_norm": 0.26535836177474403,
"acc_norm_stderr": 0.012902554762313967
},
"harness|hellaswag|10": {
"acc": 0.3313085042820155,
"acc_stderr": 0.004697217912462985,
"acc_norm": 0.39494124676359293,
"acc_norm_stderr": 0.0048783902265917105
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.035478541985608236,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.035478541985608236
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.16447368421052633,
"acc_stderr": 0.0301675334686327,
"acc_norm": 0.16447368421052633,
"acc_norm_stderr": 0.0301675334686327
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.26037735849056604,
"acc_stderr": 0.027008766090708094,
"acc_norm": 0.26037735849056604,
"acc_norm_stderr": 0.027008766090708094
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.16,
"acc_stderr": 0.0368452949177471,
"acc_norm": 0.16,
"acc_norm_stderr": 0.0368452949177471
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.03036358219723817,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.03036358219723817
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135303,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135303
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708614,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708614
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276862,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276862
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.21935483870967742,
"acc_stderr": 0.02354079935872329,
"acc_norm": 0.21935483870967742,
"acc_norm_stderr": 0.02354079935872329
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21182266009852216,
"acc_stderr": 0.02874898368994107,
"acc_norm": 0.21182266009852216,
"acc_norm_stderr": 0.02874898368994107
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24870466321243523,
"acc_stderr": 0.031195840877700314,
"acc_norm": 0.24870466321243523,
"acc_norm_stderr": 0.031195840877700314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.020932445774463196,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.020932445774463196
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.02592887613276612,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.02592887613276612
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436775,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23302752293577983,
"acc_stderr": 0.018125669180861496,
"acc_norm": 0.23302752293577983,
"acc_norm_stderr": 0.018125669180861496
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.03058759135160424,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.03058759135160424
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.029178682304842562,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.029178682304842562
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3183856502242152,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.3183856502242152,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2809917355371901,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.2809917355371901,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286773,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286773
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3247863247863248,
"acc_stderr": 0.030679022765498828,
"acc_norm": 0.3247863247863248,
"acc_norm_stderr": 0.030679022765498828
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2937420178799489,
"acc_stderr": 0.016287759388491682,
"acc_norm": 0.2937420178799489,
"acc_norm_stderr": 0.016287759388491682
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2837988826815642,
"acc_stderr": 0.015078358970751778,
"acc_norm": 0.2837988826815642,
"acc_norm_stderr": 0.015078358970751778
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.024954184324879912,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.024954184324879912
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2090032154340836,
"acc_stderr": 0.02309314039837422,
"acc_norm": 0.2090032154340836,
"acc_norm_stderr": 0.02309314039837422
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967277,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967277
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290392,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290392
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23598435462842243,
"acc_stderr": 0.01084480266966268,
"acc_norm": 0.23598435462842243,
"acc_norm_stderr": 0.01084480266966268
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3088235294117647,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.3088235294117647,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.04461272175910507,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.04461272175910507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.15918367346938775,
"acc_stderr": 0.02342097206916635,
"acc_norm": 0.15918367346938775,
"acc_norm_stderr": 0.02342097206916635
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156507,
"mc2": 0.4211587968245106,
"mc2_stderr": 0.01485785907132671
},
"harness|winogrande|5": {
"acc": 0.5303867403314917,
"acc_stderr": 0.014026510839428737
},
"harness|gsm8k|5": {
"acc": 0.02350265352539803,
"acc_stderr": 0.004172883669643965
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
KeithEdwardReynolds/LANDON | ---
license: openrail
---
|
Asad321/Irfan-Junejoscraped-data | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 829
num_examples: 2
download_size: 3686
dataset_size: 829
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Irfan-Junejoscraped-data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/aurora_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of aurora/オーロラ/欧若拉 (Azur Lane)
This is the dataset of aurora/オーロラ/欧若拉 (Azur Lane), containing 90 images and their tags.
The core tags of this character are `blonde_hair, long_hair, green_eyes, breasts, bangs, large_breasts, very_long_hair, medium_breasts, ribbon`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 90 | 194.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aurora_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 90 | 85.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aurora_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 222 | 181.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aurora_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 90 | 160.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aurora_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 222 | 293.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aurora_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
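Each IMG+TXT package above pairs every image with a same-named `.txt` file containing its tags. As a dependency-free sketch of iterating such an extracted directory (the stem-matched layout and the extension list are assumptions, not guarantees of this dataset's exact contents):

```python
import os

# common image extensions in IMG+TXT packages (assumed, not exhaustive)
IMAGE_EXTS = ('.png', '.jpg', '.jpeg', '.webp')

def iter_image_tag_pairs(dataset_dir):
    """Yield (image_path, tag_string) pairs from an extracted IMG+TXT package.

    Assumes each image has a sibling `<stem>.txt` file with comma-separated
    tags; images without a tag file are yielded with an empty tag string.
    """
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in IMAGE_EXTS:
            continue
        tags = ''
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(txt_path):
            with open(txt_path, encoding='utf-8') as fh:
                tags = fh.read().strip()
        yield os.path.join(dataset_dir, name), tags
```

For the raw package with full meta information, the waifuc loader below is the intended route.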
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). To use it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/aurora_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some character outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, china_dress, smile, black_thighhighs, closed_mouth, simple_background, white_background, bare_shoulders, cleavage, full_body, high_heels, pelvic_curtain, standing, clothing_cutout, flower, folding_fan, garter_straps, holding_fan, side_slit, black_gloves, blue_dress, bridal_gauntlets, covered_navel, earrings, hair_ornament, low-tied_long_hair, panties, petals, red_dress, red_footwear |
| 1 | 17 |  |  |  |  |  | 1girl, blush, looking_at_viewer, looking_back, sweat, from_behind, smile, solo, thighs, backboob, huge_breasts, nude, veil, earrings, armlet, bracelet, curvy, thighhighs, huge_ass, on_stomach, sideboob |
| 2 | 6 |  |  |  |  |  | 1girl, blush, closed_mouth, huge_breasts, looking_at_viewer, nipples, smile, solo, sweat, thighs, veil, jewelry, nail_polish, pubic_tattoo, pussy, navel, nude, outdoors, armlet, collarbone, detached_sleeves, lips, night, piercing, see-through, stomach, thighhighs |
| 3 | 21 |  |  |  |  |  | 1girl, bare_shoulders, solo, blush, cleavage, detached_sleeves, long_sleeves, looking_at_viewer, smile, belt, pleated_skirt, black_skirt, closed_mouth, garter_straps, hair_ribbon, white_thighhighs, hair_ornament, petals, sitting, black_ribbon, bowtie, white_dress |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | blush | china_dress | smile | black_thighhighs | closed_mouth | simple_background | white_background | bare_shoulders | cleavage | full_body | high_heels | pelvic_curtain | standing | clothing_cutout | flower | folding_fan | garter_straps | holding_fan | side_slit | black_gloves | blue_dress | bridal_gauntlets | covered_navel | earrings | hair_ornament | low-tied_long_hair | panties | petals | red_dress | red_footwear | looking_back | sweat | from_behind | thighs | backboob | huge_breasts | nude | veil | armlet | bracelet | curvy | thighhighs | huge_ass | on_stomach | sideboob | nipples | jewelry | nail_polish | pubic_tattoo | pussy | navel | outdoors | collarbone | detached_sleeves | lips | night | piercing | see-through | stomach | long_sleeves | belt | pleated_skirt | black_skirt | hair_ribbon | white_thighhighs | sitting | black_ribbon | bowtie | white_dress |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:--------------|:--------|:-------------------|:---------------|:--------------------|:-------------------|:-----------------|:-----------|:------------|:-------------|:-----------------|:-----------|:------------------|:---------|:--------------|:----------------|:--------------|:------------|:---------------|:-------------|:-------------------|:----------------|:-----------|:----------------|:---------------------|:----------|:---------|:------------|:---------------|:---------------|:--------|:--------------|:---------|:-----------|:---------------|:-------|:-------|:---------|:-----------|:--------|:-------------|:-----------|:-------------|:-----------|:----------|:----------|:--------------|:---------------|:--------|:--------|:-----------|:-------------|:-------------------|:-------|:--------|:-----------|:--------------|:----------|:---------------|:-------|:----------------|:--------------|:--------------|:-------------------|:----------|:---------------|:---------|:--------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 17 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | X | X | X | X | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 3 | 21 |  |  |  |  |  | X | X | X | X | | X | | X | | | X | X | | | | | | | | X | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X |
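The clusters above were produced automatically. As one simple way to mine similar groupings yourself (not necessarily the method used to build this table), images can be greedily grouped by Jaccard similarity of their tag sets:

```python
def jaccard(a, b):
    """Jaccard similarity of two tag sets (defined as 1.0 when both are empty)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def greedy_cluster(tag_sets, threshold=0.5):
    """Assign each tag set to the first cluster whose seed is similar enough.

    `tag_sets` is a list of sets; returns a list of clusters, each a list of
    indices into `tag_sets`. The default threshold is an arbitrary starting
    point, not a tuned value.
    """
    seeds, clusters = [], []
    for i, tags in enumerate(tag_sets):
        for j, seed in enumerate(seeds):
            if jaccard(tags, seed) >= threshold:
                clusters[j].append(i)
                break
        else:
            # no existing cluster is similar enough: start a new one
            seeds.append(tags)
            clusters.append([i])
    return clusters
```

This greedy single-pass scheme is order-dependent and only meant for quick exploration; clustering the full tag table properly would likely call for something more robust.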
|
miltongomez/aimodels | ---
language:
- en
pretty_name: "TCBench AI-model generated forecasts"
tags:
- AI-forecast
- Tropical Cyclone
- TCBench
- UNIL
- NetCDF
license: "mit"
---
This dataset contains the TCBench AI-model-generated forecasts, which are associated with the tropical cyclone (TC) benchmarking project. The data is stored in NetCDF format.
WIP |
aryanlath/Unlabelled_Seg | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 17399091.0
num_examples: 138
download_size: 17243457
dataset_size: 17399091.0
---
# Dataset Card for "Unlabelled_Seg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_psmathur__orca_mini_v2_7b | ---
pretty_name: Evaluation run of psmathur/orca_mini_v2_7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [psmathur/orca_mini_v2_7b](https://huggingface.co/psmathur/orca_mini_v2_7b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__orca_mini_v2_7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T15:49:31.845900](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_v2_7b/blob/main/results_2023-09-22T15-49-31.845900.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.19305788590604026,\n\
\ \"em_stderr\": 0.004042077305732669,\n \"f1\": 0.2522955117449661,\n\
\ \"f1_stderr\": 0.00407273200010099,\n \"acc\": 0.371547709303585,\n\
\ \"acc_stderr\": 0.008652008076903053\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.19305788590604026,\n \"em_stderr\": 0.004042077305732669,\n\
\ \"f1\": 0.2522955117449661,\n \"f1_stderr\": 0.00407273200010099\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02880970432145565,\n \
\ \"acc_stderr\": 0.004607484283767487\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.012696531870038616\n\
\ }\n}\n```"
repo_url: https://huggingface.co/psmathur/orca_mini_v2_7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T15_49_31.845900
path:
- '**/details_harness|drop|3_2023-09-22T15-49-31.845900.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T15-49-31.845900.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T15_49_31.845900
path:
- '**/details_harness|gsm8k|5_2023-09-22T15-49-31.845900.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T15-49-31.845900.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:55:35.342185.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:55:35.342185.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:55:35.342185.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T15_49_31.845900
path:
- '**/details_harness|winogrande|5_2023-09-22T15-49-31.845900.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T15-49-31.845900.parquet'
- config_name: results
data_files:
- split: 2023_07_19T16_55_35.342185
path:
- results_2023-07-19T16:55:35.342185.parquet
- split: 2023_09_22T15_49_31.845900
path:
- results_2023-09-22T15-49-31.845900.parquet
- split: latest
path:
- results_2023-09-22T15-49-31.845900.parquet
---
# Dataset Card for Evaluation run of psmathur/orca_mini_v2_7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/orca_mini_v2_7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/orca_mini_v2_7b](https://huggingface.co/psmathur/orca_mini_v2_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_v2_7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T15:49:31.845900](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_v2_7b/blob/main/results_2023-09-22T15-49-31.845900.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.19305788590604026,
"em_stderr": 0.004042077305732669,
"f1": 0.2522955117449661,
"f1_stderr": 0.00407273200010099,
"acc": 0.371547709303585,
"acc_stderr": 0.008652008076903053
},
"harness|drop|3": {
"em": 0.19305788590604026,
"em_stderr": 0.004042077305732669,
"f1": 0.2522955117449661,
"f1_stderr": 0.00407273200010099
},
"harness|gsm8k|5": {
"acc": 0.02880970432145565,
"acc_stderr": 0.004607484283767487
},
"harness|winogrande|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.012696531870038616
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
annotations_creators: []
language_creators: []
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: beir
pretty_name: BEIR Benchmark
size_categories:
msmarco:
- 1M<n<10M
trec-covid:
- 100k<n<1M
nfcorpus:
- 1K<n<10K
nq:
- 1M<n<10M
hotpotqa:
- 1M<n<10M
fiqa:
- 10K<n<100K
arguana:
- 1K<n<10K
touche-2020:
- 100K<n<1M
cqadupstack:
- 100K<n<1M
quora:
- 100K<n<1M
dbpedia:
- 1M<n<10M
scidocs:
- 10K<n<100K
fever:
- 1M<n<10M
climate-fever:
- 1M<n<10M
scifact:
- 1K<n<10K
source_datasets: []
task_categories:
- text-retrieval
- zero-shot-retrieval
- information-retrieval
- zero-shot-information-retrieval
task_ids:
- passage-retrieval
- entity-linking-retrieval
- fact-checking-retrieval
- tweet-retrieval
- citation-prediction-retrieval
- duplication-question-retrieval
- argument-retrieval
- news-retrieval
- biomedical-information-retrieval
- question-answering-retrieval
---
# Dataset Card for BEIR Benchmark
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca
### Dataset Summary
BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:
- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](http://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)
All these datasets have been preprocessed and can be used for your experiments.
```python
# Illustrative sketch: loading the FiQA-2018 portion of BEIR with the `datasets`
# library. The config names "corpus" and "queries" are assumptions based on the
# layout of the BeIR/* repositories on the Hugging Face Hub.
from datasets import load_dataset

corpus = load_dataset("BeIR/fiqa", "corpus")
queries = load_dataset("BeIR/fiqa", "queries")
```
### Supported Tasks and Leaderboards
The dataset supports a leaderboard that evaluates models on zero-shot retrieval effectiveness, with nDCG@10 as the headline metric.
The current best performing models can be found [here](https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns).
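As an illustration, a per-query retrieval metric such as nDCG@10 (the metric most commonly reported for BEIR) can be computed from a ranked list of document ids and the graded qrels judgements. The function below is a minimal sketch, not BEIR's official evaluator:

```python
import math

def ndcg_at_k(ranked_doc_ids, relevance, k=10):
    """Compute nDCG@k for a single query.

    ranked_doc_ids: document ids in the order the retriever returned them.
    relevance: {doc_id: graded relevance} taken from the qrels file.
    """
    # Discounted cumulative gain of the retrieved ranking.
    dcg = sum(
        (2 ** relevance.get(doc_id, 0) - 1) / math.log2(rank + 2)
        for rank, doc_id in enumerate(ranked_doc_ids[:k])
    )
    # Ideal DCG: gains sorted from most to least relevant.
    ideal_gains = sorted(relevance.values(), reverse=True)[:k]
    idcg = sum(
        (2 ** rel - 1) / math.log2(rank + 2)
        for rank, rel in enumerate(ideal_gains)
    )
    return dcg / idcg if idcg > 0 else 0.0
```

Averaging this value over all queries of a dataset gives the per-dataset score reported on the leaderboard.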
### Languages
All tasks are in English (`en`).
## Dataset Structure
All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:
- `corpus` file: a `.jsonl` (JSON Lines) file containing a list of dictionaries, each with three fields: `_id` with a unique document identifier, `title` with the document title (optional) and `text` with a document paragraph or passage. For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` (JSON Lines) file containing a list of dictionaries, each with two fields: `_id` with a unique query identifier and `text` with the query text. For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` (tab-separated) file containing three columns, i.e. `query-id`, `corpus-id` and `score`, in this order. Keep the first row as a header. For example: `q1 doc1 1`
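Given this layout, the three files can be parsed with the standard library alone. A minimal sketch (file paths are illustrative):

```python
import csv
import json

def load_corpus(path):
    """Read a corpus .jsonl file into {_id: {"title": ..., "text": ...}}."""
    corpus = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            doc = json.loads(line)
            corpus[doc["_id"]] = {"title": doc.get("title", ""), "text": doc["text"]}
    return corpus

def load_queries(path):
    """Read a queries .jsonl file into {_id: text}."""
    with open(path, encoding="utf-8") as f:
        return {q["_id"]: q["text"] for q in map(json.loads, f)}

def load_qrels(path):
    """Read a qrels .tsv file (header row: query-id, corpus-id, score)."""
    qrels = {}
    with open(path, encoding="utf-8", newline="") as f:
        reader = csv.reader(f, delimiter="\t")
        next(reader)  # skip the header row
        for query_id, corpus_id, score in reader:
            qrels.setdefault(query_id, {})[corpus_id] = int(score)
    return qrels
```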
### Data Instances
A high-level example of any BEIR dataset:
```python
corpus = {
"doc1" : {
"title": "Albert Einstein",
"text": "Albert Einstein was a German-born theoretical physicist. who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
},
"doc2" : {
"title": "", # Keep title an empty string if not present
"text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
},
}
queries = {
"q1" : "Who developed the mass-energy equivalence formula?",
"q2" : "Which beer is brewed with a large proportion of wheat?"
}
qrels = {
"q1" : {"doc1": 1},
"q2" : {"doc2": 1},
}
```
### Data Fields
Examples from all configurations have the following features:
### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.
### Qrels
- `qrels`: a `dict` feature representing the query-document relevance judgements, made up of:
- `query-id`: a `string` feature representing the query id
- `corpus-id`: a `string` feature, denoting the document id.
- `score`: an `int32` feature, denoting the relevance judgement between the query and the document.
### Data Splits
| Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 |
| -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:|
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
Cite as:
```
@inproceedings{
thakur2021beir,
title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021},
url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset. |
Back-up/test_ds_1 | ---
dataset_info:
features:
- name: question
dtype: string
- name: options
list:
- name: answer
dtype: string
- name: key
dtype: string
- name: answer
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 117374
num_examples: 103
download_size: 30084
dataset_size: 117374
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test_ds_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tippawan/test2-data-semi-p4-WLV | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
- name: prob
sequence: float64
- name: ifpass
sequence: int64
- name: pred
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 7309154
num_examples: 4764
download_size: 1014789
dataset_size: 7309154
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FreedomIntelligence/sharegpt-italian | ---
license: apache-2.0
---
ShareGPT data translated into Italian by gpt-3.5-turbo.
This dataset is used in research related to [MultilingualSIFT](https://github.com/FreedomIntelligence/MultilingualSIFT). |
NickyNicky/Spotify_Million_Song | ---
dataset_info:
features:
- name: artist
dtype: string
- name: song
dtype: string
- name: link
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 72985229
num_examples: 57650
download_size: 35080637
dataset_size: 72985229
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- en
---
Source: https://www.kaggle.com/ |
Tarive/nepact | ---
license: openrail
---
|
arthurmluz/xlsum_data-cstnews_results | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 34559131
num_examples: 7175
download_size: 21461885
dataset_size: 34559131
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "xlsum_data-cstnews_results"
Aggregate metric scores on the validation split:

- ROUGE: rouge1 = 0.15625934233588232, rouge2 = 0.045078034517833404, rougeL = 0.09671713244776929, rougeLsum = 0.09671713244776929
- BERTScore (averaged): precision = 0.6181117028517175, recall = 0.7212475901364449, f1 = 0.665386434830855
- MoverScore: 0.5427366803250515 |
yuan-sf63/word_label_0.2_32_P | ---
dataset_info:
features:
- name: text
dtype: string
- name: '0'
dtype: int64
- name: '1'
dtype: int64
- name: '2'
dtype: int64
- name: '3'
dtype: int64
- name: '4'
dtype: int64
- name: '5'
dtype: int64
- name: '6'
dtype: int64
- name: '7'
dtype: int64
- name: '8'
dtype: int64
- name: '9'
dtype: int64
- name: '10'
dtype: int64
- name: '11'
dtype: int64
- name: '12'
dtype: int64
- name: '13'
dtype: int64
- name: '14'
dtype: int64
- name: '15'
dtype: int64
- name: '16'
dtype: int64
- name: '17'
dtype: int64
- name: '18'
dtype: int64
- name: '19'
dtype: int64
- name: '20'
dtype: int64
- name: '21'
dtype: int64
- name: '22'
dtype: int64
- name: '23'
dtype: int64
- name: '24'
dtype: int64
- name: '25'
dtype: int64
- name: '26'
dtype: int64
- name: '27'
dtype: int64
- name: '28'
dtype: int64
- name: '29'
dtype: int64
- name: '30'
dtype: int64
- name: '31'
dtype: int64
splits:
- name: train
num_bytes: 21951104.611497793
num_examples: 64264
- name: validation
num_bytes: 2439201.3885022057
num_examples: 7141
download_size: 5729483
dataset_size: 24390306.0
---
# Dataset Card for "word_label_0.2_32_P"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
owanr/o1o2o3_xl_r2_iterater_with_human_pref | ---
dataset_info:
features:
- name: src
dtype: string
- name: tgt
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 14466316
num_examples: 35644
download_size: 2063936
dataset_size: 14466316
---
# Dataset Card for "o1o2o3_xl_r2_iterater_with_human_pref"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
metaeval/syntactic-augmentation-nli | ---
license: mit
task_ids:
- natural-language-inference
task_categories:
- text-classification
language:
- en
---
https://github.com/Aatlantise/syntactic-augmentation-nli/tree/master/datasets
```
@inproceedings{min-etal-2020-syntactic,
title = "Syntactic Data Augmentation Increases Robustness to Inference Heuristics",
author = "Min, Junghyun and
McCoy, R. Thomas and
Das, Dipanjan and
Pitler, Emily and
Linzen, Tal",
booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.acl-main.212",
doi = "10.18653/v1/2020.acl-main.212",
pages = "2339--2352",
}
``` |
maidalun1020/CrosslingualRetrievalBooksEn2Zh | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: queries
num_bytes: 5041506
num_examples: 31172
- name: corpus
num_bytes: 4804581
num_examples: 4614
download_size: 7382366
dataset_size: 9846087
---
|
JennyZZZ/guanaco-llama2-1k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 15401731
num_examples: 9846
- name: test
num_bytes: 815439
num_examples: 518
download_size: 0
dataset_size: 16217170
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
roa7n/patched_test_p_20_f_ATCaseOTCase_v4 | ---
dataset_info:
features:
- name: id
dtype: string
- name: sequence_str
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 50950318
num_examples: 139207
download_size: 4851567
dataset_size: 50950318
---
# Dataset Card for "patched_test_p_20_f_ATCaseOTCase_v4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
livinNector/wikipedia | ---
annotations_creators:
- no-annotation
language_creators:
- crowdsourced
pretty_name: Wikipedia
paperswithcode_id: null
license:
- cc-by-sa-3.0
- gfdl
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
source_datasets:
- original
multilinguality:
- multilingual
size_categories:
- n<1K
- 1K<n<10K
- 10K<n<100K
- 100K<n<1M
- 1M<n<10M
language:
- aa
- ab
- ace
- af
- ak
- als
- am
- an
- ang
- ar
- arc
- arz
- as
- ast
- atj
- av
- ay
- az
- azb
- ba
- bar
- bcl
- be
- bg
- bh
- bi
- bjn
- bm
- bn
- bo
- bpy
- br
- bs
- bug
- bxr
- ca
- cbk
- cdo
- ce
- ceb
- ch
- cho
- chr
- chy
- ckb
- co
- cr
- crh
- cs
- csb
- cu
- cv
- cy
- da
- de
- din
- diq
- dsb
- dty
- dv
- dz
- ee
- el
- eml
- en
- eo
- es
- et
- eu
- ext
- fa
- ff
- fi
- fj
- fo
- fr
- frp
- frr
- fur
- fy
- ga
- gag
- gan
- gd
- gl
- glk
- gn
- gom
- gor
- got
- gu
- gv
- ha
- hak
- haw
- he
- hi
- hif
- ho
- hr
- hsb
- ht
- hu
- hy
- ia
- id
- ie
- ig
- ii
- ik
- ilo
- inh
- io
- is
- it
- iu
- ja
- jam
- jbo
- jv
- ka
- kaa
- kab
- kbd
- kbp
- kg
- ki
- kj
- kk
- kl
- km
- kn
- ko
- koi
- krc
- ks
- ksh
- ku
- kv
- kw
- ky
- la
- lad
- lb
- lbe
- lez
- lfn
- lg
- li
- lij
- lmo
- ln
- lo
- lrc
- lt
- ltg
- lv
- lzh
- mai
- mdf
- mg
- mh
- mhr
- mi
- min
- mk
- ml
- mn
- mr
- mrj
- ms
- mt
- mus
- mwl
- my
- myv
- mzn
- na
- nah
- nan
- nap
- nds
- ne
- new
- ng
- nl
- nn
- 'no'
- nov
- nrf
- nso
- nv
- ny
- oc
- olo
- om
- or
- os
- pa
- pag
- pam
- pap
- pcd
- pdc
- pfl
- pi
- pih
- pl
- pms
- pnb
- pnt
- ps
- pt
- qu
- rm
- rmy
- rn
- ro
- ru
- rue
- rup
- rw
- sa
- sah
- sat
- sc
- scn
- sco
- sd
- se
- sg
- sgs
- sh
- si
- sk
- sl
- sm
- sn
- so
- sq
- sr
- srn
- ss
- st
- stq
- su
- sv
- sw
- szl
- ta
- tcy
- tdt
- te
- tg
- th
- ti
- tk
- tl
- tn
- to
- tpi
- tr
- ts
- tt
- tum
- tw
- ty
- tyv
- udm
- ug
- uk
- ur
- uz
- ve
- vec
- vep
- vi
- vls
- vo
- vro
- wa
- war
- wo
- wuu
- xal
- xh
- xmf
- yi
- yo
- yue
- za
- zea
- zh
- zu
language_bcp47:
- nds-nl
configs:
- 20220301.aa
- 20220301.ab
- 20220301.ace
- 20220301.ady
- 20220301.af
- 20220301.ak
- 20220301.als
- 20220301.am
- 20220301.an
- 20220301.ang
- 20220301.ar
- 20220301.arc
- 20220301.arz
- 20220301.as
- 20220301.ast
- 20220301.atj
- 20220301.av
- 20220301.ay
- 20220301.az
- 20220301.azb
- 20220301.ba
- 20220301.bar
- 20220301.bat-smg
- 20220301.bcl
- 20220301.be
- 20220301.be-x-old
- 20220301.bg
- 20220301.bh
- 20220301.bi
- 20220301.bjn
- 20220301.bm
- 20220301.bn
- 20220301.bo
- 20220301.bpy
- 20220301.br
- 20220301.bs
- 20220301.bug
- 20220301.bxr
- 20220301.ca
- 20220301.cbk-zam
- 20220301.cdo
- 20220301.ce
- 20220301.ceb
- 20220301.ch
- 20220301.cho
- 20220301.chr
- 20220301.chy
- 20220301.ckb
- 20220301.co
- 20220301.cr
- 20220301.crh
- 20220301.cs
- 20220301.csb
- 20220301.cu
- 20220301.cv
- 20220301.cy
- 20220301.da
- 20220301.de
- 20220301.din
- 20220301.diq
- 20220301.dsb
- 20220301.dty
- 20220301.dv
- 20220301.dz
- 20220301.ee
- 20220301.el
- 20220301.eml
- 20220301.en
- 20220301.eo
- 20220301.es
- 20220301.et
- 20220301.eu
- 20220301.ext
- 20220301.fa
- 20220301.ff
- 20220301.fi
- 20220301.fiu-vro
- 20220301.fj
- 20220301.fo
- 20220301.fr
- 20220301.frp
- 20220301.frr
- 20220301.fur
- 20220301.fy
- 20220301.ga
- 20220301.gag
- 20220301.gan
- 20220301.gd
- 20220301.gl
- 20220301.glk
- 20220301.gn
- 20220301.gom
- 20220301.gor
- 20220301.got
- 20220301.gu
- 20220301.gv
- 20220301.ha
- 20220301.hak
- 20220301.haw
- 20220301.he
- 20220301.hi
- 20220301.hif
- 20220301.ho
- 20220301.hr
- 20220301.hsb
- 20220301.ht
- 20220301.hu
- 20220301.hy
- 20220301.ia
- 20220301.id
- 20220301.ie
- 20220301.ig
- 20220301.ii
- 20220301.ik
- 20220301.ilo
- 20220301.inh
- 20220301.io
- 20220301.is
- 20220301.it
- 20220301.iu
- 20220301.ja
- 20220301.jam
- 20220301.jbo
- 20220301.jv
- 20220301.ka
- 20220301.kaa
- 20220301.kab
- 20220301.kbd
- 20220301.kbp
- 20220301.kg
- 20220301.ki
- 20220301.kj
- 20220301.kk
- 20220301.kl
- 20220301.km
- 20220301.kn
- 20220301.ko
- 20220301.koi
- 20220301.krc
- 20220301.ks
- 20220301.ksh
- 20220301.ku
- 20220301.kv
- 20220301.kw
- 20220301.ky
- 20220301.la
- 20220301.lad
- 20220301.lb
- 20220301.lbe
- 20220301.lez
- 20220301.lfn
- 20220301.lg
- 20220301.li
- 20220301.lij
- 20220301.lmo
- 20220301.ln
- 20220301.lo
- 20220301.lrc
- 20220301.lt
- 20220301.ltg
- 20220301.lv
- 20220301.mai
- 20220301.map-bms
- 20220301.mdf
- 20220301.mg
- 20220301.mh
- 20220301.mhr
- 20220301.mi
- 20220301.min
- 20220301.mk
- 20220301.ml
- 20220301.mn
- 20220301.mr
- 20220301.mrj
- 20220301.ms
- 20220301.mt
- 20220301.mus
- 20220301.mwl
- 20220301.my
- 20220301.myv
- 20220301.mzn
- 20220301.na
- 20220301.nah
- 20220301.nap
- 20220301.nds
- 20220301.nds-nl
- 20220301.ne
- 20220301.new
- 20220301.ng
- 20220301.nl
- 20220301.nn
- 20220301.no
- 20220301.nov
- 20220301.nrm
- 20220301.nso
- 20220301.nv
- 20220301.ny
- 20220301.oc
- 20220301.olo
- 20220301.om
- 20220301.or
- 20220301.os
- 20220301.pa
- 20220301.pag
- 20220301.pam
- 20220301.pap
- 20220301.pcd
- 20220301.pdc
- 20220301.pfl
- 20220301.pi
- 20220301.pih
- 20220301.pl
- 20220301.pms
- 20220301.pnb
- 20220301.pnt
- 20220301.ps
- 20220301.pt
- 20220301.qu
- 20220301.rm
- 20220301.rmy
- 20220301.rn
- 20220301.ro
- 20220301.roa-rup
- 20220301.roa-tara
- 20220301.ru
- 20220301.rue
- 20220301.rw
- 20220301.sa
- 20220301.sah
- 20220301.sat
- 20220301.sc
- 20220301.scn
- 20220301.sco
- 20220301.sd
- 20220301.se
- 20220301.sg
- 20220301.sh
- 20220301.si
- 20220301.simple
- 20220301.sk
- 20220301.sl
- 20220301.sm
- 20220301.sn
- 20220301.so
- 20220301.sq
- 20220301.sr
- 20220301.srn
- 20220301.ss
- 20220301.st
- 20220301.stq
- 20220301.su
- 20220301.sv
- 20220301.sw
- 20220301.szl
- 20220301.ta
- 20220301.tcy
- 20220301.te
- 20220301.tet
- 20220301.tg
- 20220301.th
- 20220301.ti
- 20220301.tk
- 20220301.tl
- 20220301.tn
- 20220301.to
- 20220301.tpi
- 20220301.tr
- 20220301.ts
- 20220301.tt
- 20220301.tum
- 20220301.tw
- 20220301.ty
- 20220301.tyv
- 20220301.udm
- 20220301.ug
- 20220301.uk
- 20220301.ur
- 20220301.uz
- 20220301.ve
- 20220301.vec
- 20220301.vep
- 20220301.vi
- 20220301.vls
- 20220301.vo
- 20220301.wa
- 20220301.war
- 20220301.wo
- 20220301.wuu
- 20220301.xal
- 20220301.xh
- 20220301.xmf
- 20220301.yi
- 20220301.yo
- 20220301.za
- 20220301.zea
- 20220301.zh
- 20220301.zh-classical
- 20220301.zh-min-nan
- 20220301.zh-yue
- 20220301.zu
dataset_info:
- config_name: 20220301.de
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 8905282792
num_examples: 2665357
download_size: 6523215105
dataset_size: 8905282792
- config_name: 20220301.en
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 20275516160
num_examples: 6458670
download_size: 20598313936
dataset_size: 20275516160
- config_name: 20220301.fr
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 7375920768
num_examples: 2402095
download_size: 5602565274
dataset_size: 7375920768
- config_name: 20220301.frr
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 9129760
num_examples: 15199
download_size: 12438017
dataset_size: 9129760
- config_name: 20220301.it
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 4539944448
num_examples: 1743035
download_size: 3516441239
dataset_size: 4539944448
- config_name: 20220301.simple
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 235072360
num_examples: 205328
download_size: 239682796
dataset_size: 235072360
---
# Dataset Card for Wikipedia
## Table of Contents
- [Dataset Card for Wikipedia](#dataset-card-for-wikipedia)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [20220301.de](#20220301de)
- [20220301.en](#20220301en)
- [20220301.fr](#20220301fr)
- [20220301.frr](#20220301frr)
- [20220301.it](#20220301it)
- [20220301.simple](#20220301simple)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://dumps.wikimedia.org](https://dumps.wikimedia.org)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Dataset Summary
Wikipedia dataset containing cleaned articles of all languages.
The datasets are built from the Wikipedia dump
(https://dumps.wikimedia.org/) with one split per language. Each example
contains the content of one full Wikipedia article, cleaned to strip
markup and unwanted sections (references, etc.).
The articles are parsed using the ``mwparserfromhell`` tool.
To load this dataset you need to install Apache Beam and ``mwparserfromhell`` first:
```
pip install apache_beam mwparserfromhell
```
Then, you can load any subset of Wikipedia per language and per date this way:
```python
from datasets import load_dataset
load_dataset("wikipedia", language="sw", date="20220120", beam_runner=...)
```
where you can pass as `beam_runner` any Apache Beam supported runner for (distributed) data processing
(see [here](https://beam.apache.org/documentation/runners/capability-matrix/)).
Pass "DirectRunner" to run it on your machine.
You can find the full list of languages and dates [here](https://dumps.wikimedia.org/backup-index.html).
Some subsets of Wikipedia have already been processed by HuggingFace, and you can load them just with:
```python
from datasets import load_dataset
load_dataset("wikipedia", "20220301.en")
```
The list of pre-processed subsets is:
- "20220301.de"
- "20220301.en"
- "20220301.fr"
- "20220301.frr"
- "20220301.it"
- "20220301.simple"
### Supported Tasks and Leaderboards
The dataset is generally used for Language Modeling.
### Languages
You can find the list of languages [here](https://meta.wikimedia.org/wiki/List_of_Wikipedias).
## Dataset Structure
### Data Instances
An example looks as follows:
```
{'id': '1',
'url': 'https://simple.wikipedia.org/wiki/April',
'title': 'April',
'text': 'April is the fourth month...'
}
```
Some subsets of Wikipedia have already been processed by HuggingFace, as you can see below:
#### 20220301.de
- **Size of downloaded dataset files:** 6523.22 MB
- **Size of the generated dataset:** 8905.28 MB
- **Total amount of disk used:** 15428.50 MB
#### 20220301.en
- **Size of downloaded dataset files:** 20598.31 MB
- **Size of the generated dataset:** 20275.52 MB
- **Total amount of disk used:** 40873.83 MB
#### 20220301.fr
- **Size of downloaded dataset files:** 5602.57 MB
- **Size of the generated dataset:** 7375.92 MB
- **Total amount of disk used:** 12978.49 MB
#### 20220301.frr
- **Size of downloaded dataset files:** 12.44 MB
- **Size of the generated dataset:** 9.13 MB
- **Total amount of disk used:** 21.57 MB
#### 20220301.it
- **Size of downloaded dataset files:** 3516.44 MB
- **Size of the generated dataset:** 4539.94 MB
- **Total amount of disk used:** 8056.39 MB
#### 20220301.simple
- **Size of downloaded dataset files:** 239.68 MB
- **Size of the generated dataset:** 235.07 MB
- **Total amount of disk used:** 474.76 MB
### Data Fields
The data fields are the same among all configurations:
- `id` (`str`): ID of the article.
- `url` (`str`): URL of the article.
- `title` (`str`): Title of the article.
- `text` (`str`): Text content of the article.
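As a quick sanity check of the record layout, here is a small sketch using the made-up example instance from above; all four fields are plain strings.

```python
# A made-up record mirroring the four fields listed above.
example = {
    "id": "1",
    "url": "https://simple.wikipedia.org/wiki/April",
    "title": "April",
    "text": "April is the fourth month...",
}

# Every field is a plain string.
assert all(isinstance(value, str) for value in example.values())
print(example["title"])  # April
```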
### Data Splits
Here are the number of examples for several configurations:
| name | train |
|-----------------|--------:|
| 20220301.de | 2665357 |
| 20220301.en | 6458670 |
| 20220301.fr | 2402095 |
| 20220301.frr | 15199 |
| 20220301.it | 1743035 |
| 20220301.simple | 205328 |
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
Most of Wikipedia's text and many of its images are co-licensed under the
[Creative Commons Attribution-ShareAlike 3.0 Unported License](https://en.wikipedia.org/wiki/Wikipedia:Text_of_Creative_Commons_Attribution-ShareAlike_3.0_Unported_License)
(CC BY-SA) and the [GNU Free Documentation License](https://en.wikipedia.org/wiki/Wikipedia:Text_of_the_GNU_Free_Documentation_License)
(GFDL) (unversioned, with no invariant sections, front-cover texts, or back-cover texts).
Some text has been imported only under CC BY-SA and CC BY-SA-compatible license and cannot be reused under GFDL; such
text will be identified on the page footer, in the page history, or on the discussion page of the article that utilizes
the text.
### Citation Information
```
@ONLINE{wikidump,
author = "Wikimedia Foundation",
title = "Wikimedia Downloads",
url = "https://dumps.wikimedia.org"
}
```
### Contributions
Thanks to [@lewtun](https://github.com/lewtun), [@mariamabarham](https://github.com/mariamabarham), [@thomwolf](https://github.com/thomwolf), [@lhoestq](https://github.com/lhoestq), [@patrickvonplaten](https://github.com/patrickvonplaten) for adding this dataset.
|
CVasNLPExperiments/textvqa_mini_validation_google_flan_t5_small_mode_OCR_VQA_Q_rices_ns_10 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 14992
num_examples: 10
download_size: 7039
dataset_size: 14992
configs:
- config_name: default
data_files:
- split: fewshot_0
path: data/fewshot_0-*
---
|
huggingartists/our-last-night | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/our-last-night"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.287611 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/03627944481dcdb782595e9d3e351853.959x959x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/our-last-night">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Our Last Night</div>
<a href="https://genius.com/artists/our-last-night">
<div style="text-align: center; font-size: 14px;">@our-last-night</div>
</a>
</div>
### Dataset Summary
A lyrics dataset parsed from Genius, designed for generating lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/our-last-night).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/our-last-night")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|179| -| -|
The 'train' split can easily be divided into 'train', 'validation', and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/our-last-night")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(
    datasets['train']['text'],
    [
        int(len(datasets['train']['text']) * train_percentage),
        int(len(datasets['train']['text']) * (train_percentage + validation_percentage)),
    ],
)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
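With the 0.9/0.07/0.03 proportions above and the 179 examples listed in the split table, the cut points land at 161/12/6 examples; a quick arithmetic check:

```python
n = 179  # size of the 'train' split above
train_pct, val_pct = 0.9, 0.07  # test takes the remaining 0.03
b1 = int(n * train_pct)
b2 = int(n * (train_pct + val_pct))
print(b1, b2 - b1, n - b2)  # 161 12 6
```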
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author={Aleksey Korshuk},
    year={2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
Asap7772/subreddit_data | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: title
dtype: string
- name: selftext
dtype: string
- name: score
dtype: int64
- name: num_comments
dtype: int64
- name: upvote_ratio
dtype: float64
- name: created_utc
dtype: float64
- name: subreddit
dtype: string
splits:
- name: train
num_bytes: 3576983.349282297
num_examples: 1128
- name: test
num_bytes: 399556.65071770333
num_examples: 126
download_size: 2489160
dataset_size: 3976540.0
---
# Dataset Card for "subreddit_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sina-Alinejad-2002/span_operation_prediction | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 199664
num_examples: 190
- name: validation
num_bytes: 10300
num_examples: 12
download_size: 151410
dataset_size: 209964
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
YUiCHl/scale | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 2701824843.5
num_examples: 12474
download_size: 2691121809
dataset_size: 2701824843.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yzhuang/autotree_automl_bank-marketing_gosdt_l512_d3 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 5538400000
num_examples: 100000
- name: validation
num_bytes: 553840000
num_examples: 10000
download_size: 809182690
dataset_size: 6092240000
---
# Dataset Card for "autotree_automl_bank-marketing_gosdt_l512_d3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
karynaur/xpr_multilingual | ---
dataset_info:
features:
- name: query
dtype: string
- name: positive
dtype: string
- name: negative
dtype: string
- name: lang
dtype: string
splits:
- name: train
num_bytes: 3223924813.78354
num_examples: 5889945
- name: test
num_bytes: 1381682532.2164602
num_examples: 2524263
download_size: 3507352983
dataset_size: 4605607346
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
license: mit
--- |
Aarya4536/therapy-bot-data-10k | ---
dataset_info:
features:
- name: question
dtype: string
- name: response_j
dtype: string
- name: response_k
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 8439579
num_examples: 10507
download_size: 3516308
dataset_size: 8439579
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/code_instructions_standardized_cluster_0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 76386019
num_examples: 7064
download_size: 23724662
dataset_size: 76386019
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_cluster_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
uclgroup8/early-exit-iemocap-embeddings | ---
dataset_info:
features:
- name: emotion
dtype: string
- name: to_translate
dtype: string
- name: early_audio_embeddings
sequence:
sequence: float64
- name: audio_embeddings
sequence:
sequence: float64
splits:
- name: train
num_bytes: 68067986
num_examples: 5501
- name: test
num_bytes: 8511732
num_examples: 688
- name: val
num_bytes: 8513280
num_examples: 688
download_size: 70763985
dataset_size: 85092998
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: val
path: data/val-*
---
|
GammaKing2000/Music_LLM_train_dataset | ---
dataset_info:
features:
- name: entries
dtype: string
splits:
- name: train
num_bytes: 16481557
num_examples: 7209
download_size: 8660733
dataset_size: 16481557
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lhallee/Thermostability_reg | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
dataset_info:
features:
- name: seqs
dtype: string
- name: labels
dtype: float64
splits:
- name: train
num_bytes: 2990210
num_examples: 5056
- name: valid
num_bytes: 373605
num_examples: 639
- name: test
num_bytes: 795351
num_examples: 1336
download_size: 4142780
dataset_size: 4159166
---
# Dataset Card for "Thermostability_reg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
determined-ai/arxiv_abstracts_2021_short | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: title
dtype: string
- name: abstract
dtype: string
splits:
- name: train
num_bytes: 98194924
num_examples: 261634
download_size: 60007305
dataset_size: 98194924
---
# Dataset Card for "arxiv_abstracts_2021_short"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
achinthani/emotion-custom | ---
size_categories: n<1K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for emotion-custom
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, install Argilla with `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("achinthani/emotion-custom")
```
### Load with `datasets`
To load this dataset with `datasets`, install it with `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("achinthani/emotion-custom")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves; at the moment only text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| text | Text | text | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| sentiment | Sentiment | label_selection | True | N/A | ['positive', 'neutral', 'negative'] |
| mixed-emotion | Mixed-emotion | multi_label_selection | True | N/A | ['joy', 'anger', 'sadness', 'fear', 'surprise', 'love'] |
The **suggestions** are human- or machine-generated recommendations for each question, provided to assist the annotator during the annotation process. They are always linked to existing questions and are named by appending "-suggestion" and "-suggestion-metadata" to the question name; these columns contain the suggested value(s) and the suggestion's metadata, respectively. The possible values are the same as in the table above.
The **metadata** is a dictionary that can be used to provide additional information about the dataset record, giving extra context to the annotators: for example, a link to the original source, the author, the date, or the provenance of the record. The metadata is always optional and can be linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
The **guidelines** are optional as well: just a plain string that can be used to provide instructions to the annotators. Find them in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"text": "i didnt feel humiliated"
},
"metadata": {},
"responses": [
{
"status": "submitted",
"user_id": "1566e368-1256-40f4-9dbf-a022ba5d117c",
"values": {
"mixed-emotion": {
"value": [
"anger"
]
},
"sentiment": {
"value": "positive"
}
}
}
],
"suggestions": [],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": null,
"metadata": "{}",
"mixed-emotion": [
{
"status": "submitted",
"user_id": "1566e368-1256-40f4-9dbf-a022ba5d117c",
"value": [
"anger"
]
}
],
"mixed-emotion-suggestion": null,
"mixed-emotion-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"sentiment": [
{
"status": "submitted",
"user_id": "1566e368-1256-40f4-9dbf-a022ba5d117c",
"value": "positive"
}
],
"sentiment-suggestion": null,
"sentiment-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"text": "i didnt feel humiliated"
}
```
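Note that in the `datasets` form the `metadata` field arrives as a JSON string and each question holds a list of responses; a minimal sketch of extracting the submitted values from a record in the shape shown above:

```python
import json

# Example record in the HuggingFace `datasets` shape shown above (trimmed)
record = {
    "metadata": "{}",
    "sentiment": [{"status": "submitted", "value": "positive"}],
    "mixed-emotion": [{"status": "submitted", "value": ["anger"]}],
    "text": "i didnt feel humiliated",
}

metadata = json.loads(record["metadata"])  # decode the serialized dict
answers = {
    name: [r["value"] for r in record[name] if r["status"] == "submitted"]
    for name in ("sentiment", "mixed-emotion")
}
print(metadata, answers)  # {} {'sentiment': ['positive'], 'mixed-emotion': [['anger']]}
```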
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; at the moment only text fields are supported. These are the ones that will be used to provide responses to the questions.
* **text** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **sentiment** is of type `label_selection` with the following allowed values ['positive', 'neutral', 'negative'].
* **mixed-emotion** is of type `multi_label_selection` with the following allowed values ['joy', 'anger', 'sadness', 'fear', 'surprise', 'love'].
* **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **sentiment-suggestion** is of type `label_selection` with the following allowed values ['positive', 'neutral', 'negative'].
* (optional) **mixed-emotion-suggestion** is of type `multi_label_selection` with the following allowed values ['joy', 'anger', 'sadness', 'fear', 'surprise', 'love'].
Additionally, we also have two more fields that are optional and are the following:
* **metadata:** An optional field providing additional information about the dataset record, useful for giving extra context to the annotators: for example, a link to the original source, the author, the date, or the provenance of the record. It can be linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
Emotion is a dataset of English Twitter messages with six basic emotions: anger, fear, joy, love, sadness, and surprise.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
PeacefulData/ATIS-Self-Align-v0 | ---
license: mit
language:
- en
size_categories:
- 1K<n<10K
--- |
vvtq/val_2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: pose
dtype: image
- name: image_caption
dtype: string
splits:
- name: train
num_bytes: 216544.0
num_examples: 2
download_size: 241613
dataset_size: 216544.0
---
# Dataset Card for "val_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/saint_louis_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of saint_louis/サン・ルイ/路易九世 (Azur Lane)
This is the dataset of saint_louis/サン・ルイ/路易九世 (Azur Lane), containing 265 images and their tags.
The core tags of this character are `breasts, long_hair, red_eyes, grey_hair, large_breasts, mole, mole_under_eye, hair_between_eyes, bangs, hair_ornament`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 265 | 479.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saint_louis_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 265 | 239.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saint_louis_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 683 | 513.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saint_louis_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 265 | 411.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saint_louis_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 683 | 765.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saint_louis_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/saint_louis_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, corset, gauntlets, holding_polearm, pleated_skirt, solo, white_skirt, looking_at_viewer, spear, miniskirt, breastplate, pantyhose, simple_background, white_background, diamond_(shape) |
| 1 | 7 |  |  |  |  |  | 1girl, breastplate, cannon, corset, diamond_(shape), gauntlets, holding_polearm, left-handed, miniskirt, pantyhose, pleated_skirt, rigging, solo, spear, turret, white_skirt, machinery, unitard, standing, white_footwear, looking_at_viewer, ribbon |
| 2 | 17 |  |  |  |  |  | 1girl, elbow_gloves, sleeveless_dress, solo, white_dress, white_gloves, bare_shoulders, cleavage, fingerless_gloves, looking_at_viewer, white_thighhighs, cross_earrings, china_dress, butterfly, sitting, blue_scarf, evening_gown, flower, thighs |
| 3 | 5 |  |  |  |  |  | 1girl, black_dress, china_dress, cleavage, garter_straps, hair_flower, looking_at_viewer, official_alternate_costume, parted_lips, solo, thighs, black_thighhighs, blush, holding_fan, see-through, sitting, black_gloves, couch, covered_navel, feather_boa, short_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | corset | gauntlets | holding_polearm | pleated_skirt | solo | white_skirt | looking_at_viewer | spear | miniskirt | breastplate | pantyhose | simple_background | white_background | diamond_(shape) | cannon | left-handed | rigging | turret | machinery | unitard | standing | white_footwear | ribbon | elbow_gloves | sleeveless_dress | white_dress | white_gloves | bare_shoulders | cleavage | fingerless_gloves | white_thighhighs | cross_earrings | china_dress | butterfly | sitting | blue_scarf | evening_gown | flower | thighs | black_dress | garter_straps | hair_flower | official_alternate_costume | parted_lips | black_thighhighs | blush | holding_fan | see-through | black_gloves | couch | covered_navel | feather_boa | short_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:------------|:------------------|:----------------|:-------|:--------------|:--------------------|:--------|:------------|:--------------|:------------|:--------------------|:-------------------|:------------------|:---------|:--------------|:----------|:---------|:------------|:----------|:-----------|:-----------------|:---------|:---------------|:-------------------|:--------------|:---------------|:-----------------|:-----------|:--------------------|:-------------------|:-----------------|:--------------|:------------|:----------|:-------------|:---------------|:---------|:---------|:--------------|:----------------|:--------------|:-----------------------------|:--------------|:-------------------|:--------|:--------------|:--------------|:---------------|:--------|:----------------|:--------------|:----------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 17 |  |  |  |  |  | X | | | | | X | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
zachary-shah/musdb18-spec-pix2pix-test | ---
dataset_info:
features:
- name: original_prompt
dtype: string
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_prompt
dtype: string
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 18297334.0
num_examples: 196
download_size: 18266177
dataset_size: 18297334.0
---
# Dataset Card for "musdb18-spec-pix2pix-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lmms-lab/ChartQA | ---
dataset_info:
features:
- name: type
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: image
dtype: image
splits:
- name: test
num_bytes: 122161182.0
num_examples: 2500
download_size: 72610993
dataset_size: 122161182.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
<p align="center" width="100%">
<img src="https://i.postimg.cc/g0QRgMVv/WX20240228-113337-2x.png" width="100%" height="80%">
</p>
# Large-scale Multi-modality Models Evaluation Suite
> Accelerating the development of large-scale multi-modality models (LMMs) with `lmms-eval`
🏠 [Homepage](https://lmms-lab.github.io/) | 📚 [Documentation](docs/README.md) | 🤗 [Huggingface Datasets](https://huggingface.co/lmms-lab)
# This Dataset
This is a formatted version of [ChartQA](https://github.com/vis-nlp/ChartQA). It is used in our `lmms-eval` pipeline to allow for one-click evaluations of large multi-modality models.
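ChartQA is commonly scored with a "relaxed accuracy" metric: exact match for textual answers, and numeric answers accepted within 5% of the target (see the ChartQA paper for the canonical definition). A minimal sketch of that matching rule, with names and edge-case handling of our own choosing:

```python
def relaxed_match(pred: str, target: str, tol: float = 0.05) -> bool:
    """Exact match for strings; within-5%-of-target match for numbers."""
    try:
        p, t = float(pred), float(target)
        return abs(p - t) <= tol * abs(t) if t != 0 else p == 0
    except ValueError:
        return pred.strip().lower() == target.strip().lower()

print(relaxed_match("104", "100"))    # True: within 5% of the target
print(relaxed_match("110", "100"))    # False: 10% off
print(relaxed_match("Blue", "blue"))  # True: case-insensitive exact match
```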
```
@article{masry2022chartqa,
title={ChartQA: A benchmark for question answering about charts with visual and logical reasoning},
author={Masry, Ahmed and Long, Do Xuan and Tan, Jia Qing and Joty, Shafiq and Hoque, Enamul},
journal={arXiv preprint arXiv:2203.10244},
year={2022}
}
``` |
Seanxh/twitter_dataset_1713208118 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 152562
num_examples: 357
download_size: 56463
dataset_size: 152562
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sugeun/legalfi | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 166499295
num_examples: 157433
download_size: 71556009
dataset_size: 166499295
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "legalfi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
language-and-voice-lab/samromur_children | ---
annotations_creators:
- crowdsourced
language:
- is
language_creators:
- crowdsourced
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: "Samrómur Children Icelandic Speech 1.0"
size_categories:
- 100K<n<1M
source_datasets:
- original
tags:
- "samromur"
- children's speech
- 'icelandic: iceland'
- icelandic children
- icelandic kids
- kids
task_categories:
- automatic-speech-recognition
task_ids: []
---
# Dataset Card for samromur_children
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Samrómur Children Icelandic Speech 1.0](https://samromur.is/)
- **Repository:** [LDC](https://catalog.ldc.upenn.edu/LDC2022S11)
- **Paper:** [Samrómur Children: An Icelandic Speech Corpus](https://aclanthology.org/2022.lrec-1.105.pdf)
- **Point of Contact:** [Carlos Mena](mailto:carlos.mena@ciempiess.org), [Jón Guðnason](mailto:jg@ru.is)
### Dataset Summary
The Samrómur Children Corpus consists of audio recordings and metadata files containing the prompts read by the participants. It contains more than 137,000 validated speech recordings uttered by Icelandic children.
The corpus is the result of a crowd-sourcing effort run by the Language and Voice Lab (LVL) at Reykjavik University, in cooperation with Almannarómur, Center for Language Technology. The recording process began in October 2019 and continues to this day (September 2021).
### Example Usage
The Samrómur Children Corpus is divided into three splits: train, validation, and test. To load the full dataset:
```python
from datasets import load_dataset
samromur_children = load_dataset("language-and-voice-lab/samromur_children")
```
To load a specific split (for example, the validation split), pass its name via the `split` argument:
```python
from datasets import load_dataset
samromur_children = load_dataset("language-and-voice-lab/samromur_children",split="validation")
```
### Supported Tasks
automatic-speech-recognition: The dataset can be used to train a model for Automatic Speech Recognition (ASR). The model is presented with an audio file and asked to transcribe it to written text. The most common evaluation metric is the word error rate (WER).
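The WER mentioned above is the word-level edit distance between the reference and hypothesis transcripts, normalized by reference length; a minimal pure-Python sketch (packages such as `jiwer` provide tested implementations):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    d = list(range(len(hyp) + 1))  # one DP row over the hypothesis
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i
        for j, h in enumerate(hyp, 1):
            cur = d[j]
            d[j] = min(d[j] + 1,         # deletion
                       d[j - 1] + 1,     # insertion
                       prev + (r != h))  # substitution (or match)
            prev = cur
    return d[-1] / len(ref)

print(wer("hin unga bylting", "hin unga bylting"))  # 0.0
print(wer("hin unga bylting", "hin bylting"))       # one deletion -> ~0.333
```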
### Languages
The audio is in Icelandic.
The reading prompts were gathered from a variety of sources, mainly from the [Icelandic Gigaword Corpus](http://clarin.is/en/resources/gigaword). The corpus includes text from novels, news, plays, and from a list of location names in Iceland. The prompts also came from the [Icelandic Web of Science](https://www.visindavefur.is/).
## Dataset Structure
### Data Instances
```python
{
'audio_id': '015652-0717240',
'audio': {
'path': '/home/carlos/.cache/HuggingFace/datasets/downloads/extracted/2c6b0d82de2ef0dc0879732f726809cccbe6060664966099f43276e8c94b03f2/test/015652/015652-0717240.flac',
'array': array([ 0. , 0. , 0. , ..., -0.00311279,
-0.0007019 , 0.00128174], dtype=float32),
'sampling_rate': 16000
},
'speaker_id': '015652',
'gender': 'female',
'age': '11',
'duration': 4.179999828338623,
'normalized_text': 'eiginlega var hann hin unga rússneska bylting lifandi komin'
}
```
### Data Fields
* `audio_id` (string) - id of audio segment
* `audio` (datasets.Audio) - a dictionary containing the path to the audio, the decoded audio array, and the sampling rate. In non-streaming mode (default), the path points to the locally extracted audio. In streaming mode, the path is the relative path of an audio inside its archive (as files are not downloaded and extracted locally).
* `speaker_id` (string) - id of speaker
* `gender` (string) - gender of speaker (male or female)
* `age` (string) - age of the speaker (participants are between 4 and 17 years old).
* `duration` (float32) - duration of the audio file in seconds.
* `normalized_text` (string) - normalized audio segment transcription.
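As a sanity check, the `duration` field should match the length of the decoded `array` divided by `sampling_rate`. A small sketch using the field names from the instance above:

```python
def duration_seconds(example: dict) -> float:
    """Recompute the clip duration from the decoded audio samples."""
    audio = example["audio"]
    return len(audio["array"]) / audio["sampling_rate"]

# e.g. a 4.18 s clip at 16 kHz holds 66,880 samples
example = {"audio": {"array": [0.0] * 66880, "sampling_rate": 16000}, "duration": 4.18}
print(abs(duration_seconds(example) - example["duration"]) < 1e-6)  # True
```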
### Data Splits
The corpus is split into train, dev, and test portions. The lengths of the portions are: train = 127h25m, test = 1h50m, dev = 1h50m.
To load a specific portion, please see the "Example Usage" section above.
## Dataset Creation
### Curation Rationale
In the field of Automatic Speech Recognition (ASR), it is a known fact that children's speech is particularly hard to recognise due to its high variability, produced by developmental changes in children's anatomy and speech production skills.
For this reason, the selection criteria for the train/dev/test portions have to take the children's age into account. Nevertheless, Samrómur Children is an unbalanced corpus in terms of the gender and age of the speakers. For example, the corpus has a total of 1667 female speakers (73h38m) versus 1412 male speakers (52h26m).
These imbalances impose conditions on the types of experiments that can be performed with the corpus: for example, an equal number of female and male speakers across certain age ranges is impossible. So, if one cannot have a perfectly balanced corpus in the training set, at least one can have it in the test portion.
The test portion of Samrómur Children was meticulously selected to cover ages between 6 and 16 years for both female and male speakers. Each of these age ranges, in both genders, has a total duration of 5 minutes.
The development portion of the corpus contains only speakers with unknown gender information. Both the test and dev sets have a total duration of 1h50m each.
In order to perform fairer experiments, no speakers are shared between the train and test sets. There is, however, one speaker shared between the train and development sets, identifiable by the speaker ID 010363; no audio files are shared between these two sets.
### Source Data
#### Initial Data Collection and Normalization
The data was collected using the website https://samromur.is, the code of which is available at https://github.com/cadia-lvl/samromur. The age range selected for this corpus is between 4 and 17 years.
The original audio was collected at a 44.1 kHz or 48 kHz sampling rate as *.wav files, which were down-sampled to 16 kHz and converted to *.flac. Each recording contains one read sentence from a script. The script contains 85,080 unique sentences and 90,838 unique tokens.
There was no identifier other than the session ID, which is used as the speaker ID. The corpus is distributed with a metadata file containing detailed information on each utterance and speaker. The metadata file is encoded as UTF-8 Unicode.
The prompts were gathered from a variety of sources, mainly from The Icelandic Gigaword Corpus, which is available at http://clarin.is/en/resources/gigaword. The corpus includes text from novels, news, plays, and from a list of location names in Iceland. The prompts also came from the [Icelandic Web of Science](https://www.visindavefur.is/).
### Annotations
#### Annotation process
Prompts were pulled from these corpora if they met the criteria of having only letters which are present in the Icelandic alphabet, and if they are listed in the [DIM: Database Icelandic Morphology](https://aclanthology.org/W19-6116.pdf).
There are also synthesised prompts consisting of a name followed by a question or a demand, in order to simulate a dialogue with a smart-device.
#### Who are the annotators?
The content of the audio files was manually verified against the prompts by one or more listeners (mainly summer students).
### Personal and Sensitive Information
The dataset consists of recordings from people who have donated their voices. You agree not to attempt to determine the identity of speakers in this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This is the first ASR corpus of Icelandic children.
### Discussion of Biases
* The utterances were recorded by a smartphone or the web app.
* Participants self-reported their age group, gender, and native language.
* Participants are aged between 4 and 17 years.
* The corpus contains 137597 utterances from 3175 speakers, totalling 131 hours.
* The amount of data from female speakers is 73h38m, from male speakers 52h26m, and from speakers with unknown gender information 05h02m.
* The number of female speakers is 1667 and the number of male speakers is 1412. The number of speakers with unknown gender information is 96.
* There are 78993 recordings from female speakers, 53927 from male speakers, and 4677 from speakers with unknown gender information.
### Other Known Limitations
"Samrómur Children: Icelandic Speech 21.09" by the Language and Voice Laboratory (LVL) at the Reykjavik University is licensed under a Creative Commons Attribution 4.0 International (CC BY 4.0) License with the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
## Additional Information
### Dataset Curators
The corpus is the result of a crowd-sourcing effort run by the Language and Voice Lab (LVL) at Reykjavik University, in cooperation with Almannarómur, Center for Language Technology. The recording process started in October 2019 and continues to this day (September 2021). The corpus was curated by Carlos Daniel Hernández Mena in 2021.
### Licensing Information
[CC-BY-4.0](https://creativecommons.org/licenses/by/4.0/)
### Citation Information
```
@misc{menasamromurchildren2021,
title={Samrómur Children Icelandic Speech 1.0},
ldc_catalog_no={LDC2022S11},
DOI={https://doi.org/10.35111/frrj-qd60},
author={Hernández Mena, Carlos Daniel and Borsky, Michal and Mollberg, David Erik and Guðmundsson, Smári Freyr and Hedström, Staffan and Pálsson, Ragnar and Jónsson, Ólafur Helgi and Þorsteinsdóttir, Sunneva and Guðmundsdóttir, Jóhanna Vigdís and Magnúsdóttir, Eydís Huld and Þórhallsdóttir, Ragnheiður and Guðnason, Jón},
publisher={Reykjavík University},
journal={Linguistic Data Consortium, Philadelphia},
year={2021},
url={https://catalog.ldc.upenn.edu/LDC2022S11},
}
```
### Contributions
This project was funded by the Language Technology Programme for Icelandic 2019-2023. The programme, which is managed and coordinated by Almannarómur, is funded by the Icelandic Ministry of Education, Science and Culture.
The verification of the dataset was funded by the Icelandic Directorate of Labour's Student Summer Job Program in 2020 and 2021.
Special thanks to the summer students for all the hard work.
|
simpledia/simpledia | ---
dataset_info:
- config_name: auto_math_text
features:
- name: prompt
dtype: string
- name: text_token_length
dtype: int64
- name: text
dtype: string
- name: seed_data
dtype: string
- name: format
dtype: string
- name: audience
dtype: string
splits:
- name: train
num_bytes: 8777587297.907892
num_examples: 1949895
download_size: 4461401898
dataset_size: 8777587297.907892
- config_name: khanacademy
features:
- name: prompt
dtype: string
- name: text_token_length
dtype: int64
- name: text
dtype: string
- name: seed_data
dtype: string
- name: format
dtype: string
- name: audience
dtype: string
splits:
- name: train
num_bytes: 108591354.09210858
num_examples: 24123
download_size: 49139761
dataset_size: 108591354.09210858
- config_name: openstax
features:
- name: text_token_length
dtype: int64
- name: prompt
dtype: string
- name: text
dtype: string
- name: seed_data
dtype: string
- name: format
dtype: string
- name: audience
dtype: string
splits:
- name: train
num_bytes: 667837450
num_examples: 126332
download_size: 346992522
dataset_size: 667837450
- config_name: stanford
features:
- name: text_token_length
dtype: int64
- name: prompt
dtype: string
- name: text
dtype: string
- name: seed_data
dtype: string
- name: format
dtype: string
- name: audience
dtype: string
splits:
- name: train
num_bytes: 6341291506
num_examples: 1020024
download_size: 3302284560
dataset_size: 6341291506
# - config_name: stories
# features:
# - name: text
# dtype: string
# - name: prompt
# dtype: string
# - name: text_token_length
# dtype: int64
# - name: seed_data
# dtype: string
# - name: format
# dtype: string
# - name: audience
# dtype: string
# splits:
# - name: train
# num_bytes: 21314739648
# num_examples: 4992964
# download_size: 11902294709
# dataset_size: 21314739648
- config_name: web_sample
features:
- name: text_token_length
dtype: int64
- name: prompt
dtype: string
- name: text
dtype: string
- name: seed_data
dtype: string
- name: format
dtype: string
- name: audience
dtype: string
splits:
- name: train
num_bytes: 69075726295
num_examples: 12426348
download_size: 38978124936
dataset_size: 69075726295
# - config_name: web_samples_v2
# features:
# - name: text_token_length
# dtype: int64
# - name: prompt
# dtype: string
# - name: text
# dtype: string
# - name: seed_data
# dtype: string
# - name: format
# dtype: string
# - name: audience
# dtype: string
# splits:
# - name: train
# num_bytes: 58711802939
# num_examples: 10345867
# download_size: 32658254617
# dataset_size: 58711802939
- config_name: wikihow
features:
- name: text_token_length
dtype: int64
- name: prompt
dtype: string
- name: text
dtype: string
- name: seed_data
dtype: string
- name: format
dtype: string
- name: audience
dtype: string
splits:
- name: train
num_bytes: 892720528
num_examples: 179191
download_size: 502284600
dataset_size: 892720528
configs:
- config_name: auto_math_text
data_files:
- split: train
path: data/auto_math_text/train-*
- config_name: khanacademy
data_files:
- split: train
path: data/khanacademy/train-*
- config_name: openstax
data_files:
- split: train
path: data/openstax/train-*
- config_name: stanford
data_files:
- split: train
path: data/stanford/train-*
# - config_name: stories
# data_files:
# - split: train
# path: data/stories/train-*
- config_name: web_samples_v1
data_files:
- split: train
path: data/web_sample/train-*
# - config_name:
- config_name: wikihow
data_files:
- split: train
path: data/wikihow/train-*
license: apache-2.0
language:
- en
tags:
- synthetic
--- |
ndhieunguyen/LPM-24 | ---
dataset_info:
features:
- name: image
dtype: image
- name: id
dtype: int64
- name: canonical
dtype: string
- name: selfies
dtype: string
- name: caption
dtype: string
splits:
- name: train
num_bytes: 2269140389.488
num_examples: 126864
- name: validation
num_bytes: 597161224.016
num_examples: 33696
download_size: 2757974974
dataset_size: 2866301613.5039997
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
Nicolas-BZRD/CAPP_opendata | ---
language:
- fr
license: odc-by
size_categories:
- 10K<n<100K
pretty_name: Fonds documentaire de jurisprudence des cours d’appel et des juridictions
de premier degré
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 988217109
num_examples: 72703
download_size: 459322605
dataset_size: 988217109
tags:
- legal
---
# CAPP (case law from appeal courts and courts of first instance)
[Documentary collection of case law from appeal courts and courts of first instance](https://www.data.gouv.fr/en/datasets/capp/), including
a selection of decisions in civil and criminal matters.
Decisions are selected by the courts in accordance with decree no. 2005-13 of January 7, 2005, amending the Code de l'organisation judiciaire (regulatory part) and relating to the Service de documentation, des études et du rapport of the Cour de cassation.
Time coverage: since 1997.
olm/olm-wikipedia-20221001 | ---
annotations_creators:
- no-annotation
language:
- en
language_creators:
- found
license: []
multilinguality:
- monolingual
pretty_name: OLM October 2022 Wikipedia
size_categories:
- 1M<n<10M
source_datasets: []
tags:
- pretraining
- language modelling
- wikipedia
- web
task_categories: []
task_ids: []
---
# Dataset Card for OLM October 2022 Wikipedia
Pretraining dataset, created with the OLM repo [here](https://github.com/huggingface/olm-datasets) from an October 2022 Wikipedia snapshot. |
sethapun/arithmetic_2as_1to2 | ---
dataset_info:
features:
- name: expression
dtype: string
- name: answer
dtype: int64
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 54000
num_examples: 2000
- name: validation
num_bytes: 10800
num_examples: 400
download_size: 6591
dataset_size: 64800
---
# Dataset Card for "arithmetic_2as_1to2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-49000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 651306
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kephalian/Ear_drum_identification | ---
license: apache-2.0
task_categories:
- object-detection
language:
- en
size_categories:
- n<1K
---
This is a YOLO-format dataset with images annotated using Roboflow.
All the images are of healthy, normal human ear drums or tympanic membranes.
Both right and left tympanic membranes are included.
The idea was to create a model to identify normal versus diseased ear drums (mostly by the absence of light reflex).
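In the YOLO annotation format, each image has a companion text file with one line per object: a class index followed by the box centre, width and height, all normalised to [0, 1]. A small parsing sketch (the function name and return convention are illustrative, not part of this dataset's tooling):

```python
def parse_yolo_line(line: str, img_w: int, img_h: int):
    """Convert one YOLO label line to a pixel-space (class_id, x1, y1, x2, y2) box."""
    cls, cx, cy, w, h = line.split()
    # de-normalise centre and size to pixels
    cx, cy = float(cx) * img_w, float(cy) * img_h
    w, h = float(w) * img_w, float(h) * img_h
    x1, y1 = cx - w / 2, cy - h / 2
    return int(cls), x1, y1, x1 + w, y1 + h

# a box centred in a 640x480 image, covering half of each dimension
print(parse_yolo_line("0 0.5 0.5 0.5 0.5", 640, 480))  # (0, 160.0, 120.0, 480.0, 360.0)
```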
The model was able to reach 100% accuracy with this dataset in correctly identifying the presence of light reflex. |
guidevit/python_code_summarization | ---
license: apache-2.0
---
|
Erickbarbosa/Eumesmo | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_aiplanet__effi-7b | ---
pretty_name: Evaluation run of aiplanet/effi-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [aiplanet/effi-7b](https://huggingface.co/aiplanet/effi-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aiplanet__effi-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-16T00:38:54.872293](https://huggingface.co/datasets/open-llm-leaderboard/details_aiplanet__effi-7b/blob/main/results_2023-10-16T00-38-54.872293.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.0003921042190298541,\n \"f1\": 0.06146078020134238,\n\
\ \"f1_stderr\": 0.0013862861484435665,\n \"acc\": 0.37858887140948305,\n\
\ \"acc_stderr\": 0.008690432281689055\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298541,\n\
\ \"f1\": 0.06146078020134238,\n \"f1_stderr\": 0.0013862861484435665\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03184230477634572,\n \
\ \"acc_stderr\": 0.004836348558260928\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7253354380426204,\n \"acc_stderr\": 0.012544516005117185\n\
\ }\n}\n```"
repo_url: https://huggingface.co/aiplanet/effi-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_16T00_38_54.872293
path:
- '**/details_harness|drop|3_2023-10-16T00-38-54.872293.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-16T00-38-54.872293.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_16T00_38_54.872293
path:
- '**/details_harness|gsm8k|5_2023-10-16T00-38-54.872293.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-16T00-38-54.872293.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_16T00_38_54.872293
path:
- '**/details_harness|winogrande|5_2023-10-16T00-38-54.872293.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-16T00-38-54.872293.parquet'
- config_name: results
data_files:
- split: 2023_10_16T00_38_54.872293
path:
- results_2023-10-16T00-38-54.872293.parquet
- split: latest
path:
- results_2023-10-16T00-38-54.872293.parquet
---
# Dataset Card for Evaluation run of aiplanet/effi-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/aiplanet/effi-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [aiplanet/effi-7b](https://huggingface.co/aiplanet/effi-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aiplanet__effi-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-16T00:38:54.872293](https://huggingface.co/datasets/open-llm-leaderboard/details_aiplanet__effi-7b/blob/main/results_2023-10-16T00-38-54.872293.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298541,
"f1": 0.06146078020134238,
"f1_stderr": 0.0013862861484435665,
"acc": 0.37858887140948305,
"acc_stderr": 0.008690432281689055
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298541,
"f1": 0.06146078020134238,
"f1_stderr": 0.0013862861484435665
},
"harness|gsm8k|5": {
"acc": 0.03184230477634572,
"acc_stderr": 0.004836348558260928
},
"harness|winogrande|5": {
"acc": 0.7253354380426204,
"acc_stderr": 0.012544516005117185
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
HL121/stat453_dataset | ---
dataset_info:
features:
- name: source_img
dtype: image
- name: instruction
dtype: string
- name: target_img
dtype: image
splits:
- name: train
num_bytes: 678834941.8
num_examples: 2800
download_size: 695006048
dataset_size: 678834941.8
---
# Dataset Card for "stat453_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
valdineiarcenio/gustavoclonagem2 | ---
license: openrail
---
|
sankettgorey/donut_3 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 166935733.91680533
num_examples: 540
- name: test
num_bytes: 19420774.083194677
num_examples: 61
download_size: 145179159
dataset_size: 186356508.0
---
# Dataset Card for "donut_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
316usman/thematic4c | ---
license: bsd
dataset_info:
features:
- name: text
dtype: string
- name: thematic
dtype: string
- name: sub-thematic
dtype: string
- name: country
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 72085984
num_examples: 106499
download_size: 23843869
dataset_size: 72085984
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Lollitor/CID87 | ---
dataset_info:
config_name: Lollitor
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4573
num_examples: 109
download_size: 2375
dataset_size: 4573
configs:
- config_name: Lollitor
data_files:
- split: train
path: Lollitor/train-*
---
# Dataset Card for "CID87"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_50_1713216148 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 1354276
num_examples: 3381
download_size: 687636
dataset_size: 1354276
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lukekim420/sshsbamboobot_data | ---
license: apache-2.0
---
|
CyberHarem/umeki_otoha_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of umeki_otoha/梅木音葉 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of umeki_otoha/梅木音葉 (THE iDOLM@STER: Cinderella Girls), containing 78 images and their tags.
The core tags of this character are `blonde_hair, short_hair, breasts, green_eyes, blue_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 78 | 107.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/umeki_otoha_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 78 | 62.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/umeki_otoha_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 174 | 128.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/umeki_otoha_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 78 | 94.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/umeki_otoha_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 174 | 181.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/umeki_otoha_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/umeki_otoha_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, solo, blush, smile, looking_at_viewer, hat, open_mouth, skirt, white_background, microphone |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | smile | looking_at_viewer | hat | open_mouth | skirt | white_background | microphone |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------|:--------------------|:------|:-------------|:--------|:-------------------|:-------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X |
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-134000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1027187
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-lener_br-lener_br-b36dee-1776161639 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- lener_br
eval_info:
task: entity_extraction
model: Luciano/bertimbau-base-lener-br-finetuned-lener-br
metrics: []
dataset_name: lener_br
dataset_config: lener_br
dataset_split: validation
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: Luciano/bertimbau-base-lener-br-finetuned-lener-br
* Dataset: lener_br
* Config: lener_br
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Luciano](https://huggingface.co/Luciano) for evaluating this model. |
one-sec-cv12/chunk_157 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 17014134816.25
num_examples: 177142
download_size: 15023435045
dataset_size: 17014134816.25
---
# Dataset Card for "chunk_157"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-OpenOrca_20w | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-OpenOrca_20w
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-OpenOrca_20w](https://huggingface.co/CHIH-HUNG/llama-2-13b-OpenOrca_20w)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-OpenOrca_20w\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T01:14:55.229555](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-OpenOrca_20w/blob/main/results_2023-10-18T01-14-55.229555.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.14953859060402686,\n\
\ \"em_stderr\": 0.0036521078888639676,\n \"f1\": 0.20982382550335602,\n\
\ \"f1_stderr\": 0.003706029190176112,\n \"acc\": 0.44925660000490675,\n\
\ \"acc_stderr\": 0.010476365550372343\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.14953859060402686,\n \"em_stderr\": 0.0036521078888639676,\n\
\ \"f1\": 0.20982382550335602,\n \"f1_stderr\": 0.003706029190176112\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12661106899166036,\n \
\ \"acc_stderr\": 0.009159715283081094\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.011793015817663592\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-OpenOrca_20w
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T01_14_55.229555
path:
- '**/details_harness|drop|3_2023-10-18T01-14-55.229555.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T01-14-55.229555.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T01_14_55.229555
path:
- '**/details_harness|gsm8k|5_2023-10-18T01-14-55.229555.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T01-14-55.229555.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T01_14_55.229555
path:
- '**/details_harness|winogrande|5_2023-10-18T01-14-55.229555.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T01-14-55.229555.parquet'
- config_name: results
data_files:
- split: 2023_10_18T01_14_55.229555
path:
- results_2023-10-18T01-14-55.229555.parquet
- split: latest
path:
- results_2023-10-18T01-14-55.229555.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-OpenOrca_20w
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-OpenOrca_20w
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-OpenOrca_20w](https://huggingface.co/CHIH-HUNG/llama-2-13b-OpenOrca_20w) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-OpenOrca_20w",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-18T01:14:55.229555](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-OpenOrca_20w/blob/main/results_2023-10-18T01-14-55.229555.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.14953859060402686,
"em_stderr": 0.0036521078888639676,
"f1": 0.20982382550335602,
"f1_stderr": 0.003706029190176112,
"acc": 0.44925660000490675,
"acc_stderr": 0.010476365550372343
},
"harness|drop|3": {
"em": 0.14953859060402686,
"em_stderr": 0.0036521078888639676,
"f1": 0.20982382550335602,
"f1_stderr": 0.003706029190176112
},
"harness|gsm8k|5": {
"acc": 0.12661106899166036,
"acc_stderr": 0.009159715283081094
},
"harness|winogrande|5": {
"acc": 0.7719021310181531,
"acc_stderr": 0.011793015817663592
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CVasNLPExperiments/OxfordPets_test_google_flan_t5_xl_mode_T_SPECIFIC_A_ns_3669 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_text_davinci_003_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 1535357
num_examples: 3669
- name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_text_davinci_003_full_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 1535404
num_examples: 3669
- name: fewshot_3_clip_tags_ViT_L_14_Attributes_ViT_L_14_text_davinci_003_full_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 5797400
num_examples: 3669
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 1457941
num_examples: 3669
- name: fewshot_1__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 2808418
num_examples: 3669
- name: fewshot_3__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 5508030
num_examples: 3669
download_size: 2261474
dataset_size: 18642550
---
# Dataset Card for "OxfordPets_test_google_flan_t5_xl_mode_T_SPECIFIC_A_ns_3669"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jpqueiroz335/pul | ---
license: openrail
---
|
feynman-integrals-nn/t331ZZZM | ---
license: cc-by-4.0
---
* [data](https://huggingface.co/datasets/feynman-integrals-nn/t331ZZZM)
* [source](https://gitlab.com/feynman-integrals-nn/feynman-integrals-nn/-/tree/main/t331ZZZM)
|
Paia2/RaulBio | ---
license: openrail
---
|
cyberagent/crello | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license: cdla-permissive-2.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- unconditional-image-generation
task_ids: []
pretty_name: crello
tags:
- graphic design
- design templates
dataset_info:
features:
- name: id
dtype: string
- name: length
dtype: int64
- name: group
dtype:
class_label:
names:
'0': SM
'1': HC
'2': MM
'3': SMA
'4': EO
'5': BG
- name: format
dtype:
class_label:
names:
'0': Instagram Story
'1': Instagram
'2': Facebook
'3': Facebook cover
'4': Twitter
'5': Facebook AD
'6': Poster
'7': Instagram AD
'8': Tumblr
'9': Image
'10': Pinterest
'11': Flayer
'12': FB event cover
'13': Postcard
'14': Invitation
'15': Youtube
'16': Email header
'17': Medium Rectangle
'18': Poster US
'19': Graphic
'20': Large Rectangle
'21': Card
'22': Logo
'23': Title
'24': Skyscraper
'25': Leaderboard
'26': Presentation
'27': Gift Certificate
'28': VK Universal Post
'29': Youtube Thumbnail
'30': Business card
'31': Book Cover
'32': Presentation Wide
'33': VK Community Cover
'34': Certificate
'35': Zoom Background
'36': VK Post with Button
'37': T-Shirt
'38': Instagram Highlight Cover
'39': Coupon
'40': Letterhead
'41': IGTV Cover
'42': Schedule Planner
'43': Album Cover
'44': LinkedIn Cover
'45': Storyboard
'46': Recipe Card
'47': Invoice
'48': Resume
'49': Menu
'50': Mood Board
'51': Mind Map
'52': Label
'53': Newsletter
'54': Brochure
'55': Ticket
'56': Proposal
'57': Snapchat Geofilter
'58': Snapchat Moment Filter
'59': Twitch Offline Banner
'60': Twitch Profile Banner
'61': Infographic
'62': Mobile Presentation
'63': Photo Book
'64': Web Banner
'65': Gallery Image
'66': Calendar
- name: canvas_width
dtype:
class_label:
names:
'0': '1080'
'1': '1200'
'2': '940'
'3': '851'
'4': '360'
'5': '1190'
'6': '1920'
'7': '419'
'8': '1024'
'9': '600'
'10': '1600'
'11': '735'
'12': '595'
'13': '3000'
'14': '2560'
'15': '1500'
'16': '300'
'17': '540'
'18': '1296'
'19': '336'
'20': '500'
'21': '432'
'22': '560'
'23': '160'
'24': '1280'
'25': '728'
'26': '1000'
'27': '241'
'28': '1590'
'29': '792'
'30': '576'
'31': '537'
'32': '1008'
'33': '420'
'34': '1128'
'35': '396'
'36': '841'
'37': '800'
'38': '635'
'39': '240'
'40': '842'
- name: canvas_height
dtype:
class_label:
names:
'0': '1080'
'1': '1920'
'2': '315'
'3': '788'
'4': '628'
'5': '600'
'6': '504'
'7': '1683'
'8': '298'
'9': '500'
'10': '512'
'11': '1102'
'12': '1440'
'13': '200'
'14': '400'
'15': '250'
'16': '810'
'17': '1728'
'18': '1200'
'19': '280'
'20': '841'
'21': '288'
'22': '90'
'23': '1055'
'24': '720'
'25': '768'
'26': '700'
'27': '142'
'28': '612'
'29': '2560'
'30': '2000'
'31': '240'
'32': '216'
'33': '842'
'34': '1296'
'35': '2340'
'36': '654'
'37': '191'
'38': '1600'
'39': '297'
'40': '595'
'41': '480'
'42': '576'
'43': '320'
'44': '380'
'45': '141'
- name: category
dtype:
class_label:
names:
'0': holidaysCelebration
'1': foodDrinks
'2': fashionStyle
'3': businessFinance
'4': homeStuff
'5': handcraftArt
'6': beauty
'7': leisureEntertainment
'8': natureWildlife
'9': educationScience
'10': technology
'11': medical
'12': socialActivityCharity
'13': realEstateBuilding
'14': sportExtreme
'15': travelsVacations
'16': pets
'17': religions
'18': citiesPlaces
'19': industry
'20': transportation
'21': kidsParents
'22': all
- name: title
dtype: string
- name: type
sequence:
class_label:
names:
'0': svgElement
'1': textElement
'2': imageElement
'3': coloredBackground
'4': maskElement
- name: left
sequence: float32
- name: top
sequence: float32
- name: width
sequence: float32
- name: height
sequence: float32
- name: opacity
sequence: float32
- name: color
sequence:
sequence: float32
length: 3
- name: image
sequence: image
- name: text
sequence: string
- name: font
sequence:
class_label:
names:
'0': ''
'1': Montserrat
'2': Bebas Neue
'3': Raleway
'4': Josefin Sans
'5': Cantarell
'6': Playfair Display
'7': Oswald
'8': Blogger
'9': Abril Fatface
'10': Prompt
'11': Comfortaa
'12': Rubik
'13': Open Sans
'14': Roboto
'15': Libre Baskerville
'16': Quicksand
'17': Dosis
'18': Podkova
'19': Lato
'20': Cormorant Infant
'21': Amatic Sc
'22': Fjalla One
'23': Playlist Script
'24': Arapey
'25': Baloo Tamma
'26': Graduate
'27': Titillium Web
'28': Kreon
'29': Nunito
'30': Rammetto One
'31': Anton
'32': Poiret One
'33': Alfa Slab One
'34': Righteous
'35': Play
'36': Space Mono
'37': Frank Ruhl Libre
'38': Yanone Kaffeesatz
'39': Pacifico
'40': Bangers
'41': Yellowtail
'42': Droid Serif
'43': Racing Sans One
'44': Merriweather
'45': Miriam Libre
'46': Crete Round
'47': Rubik One
'48': Bungee
'49': Sansita One
'50': Patua One
'51': Economica
'52': Caveat
'53': Philosopher
'54': Limelight
'55': Breathe
'56': Rokkitt
'57': Russo One
'58': Noticia Text
'59': Tinos
'60': Oleo Script
'61': Josefin Slab
'62': Arima Madurai
'63': Brusher Free Font
'64': Old Standard Tt
'65': Kalam
'66': Patrick Hand
'67': Playball
'68': Six Caps
'69': Bad Script
'70': Orbitron
'71': Contrail One
'72': Selima Script
'73': Gravitas One
'74': El Messiri
'75': Bubbler One
'76': Italiana
'77': Pompiere
'78': Lemon Tuesday
'79': Vast Shadow
'80': Sunday
'81': Cookie
'82': Exo 2
'83': Barrio
'84': Radley
'85': Mrs Sheppards
'86': Grand Hotel
'87': Great Vibes
'88': Maven Pro
'89': Knewave
'90': Damion
'91': Tulpen One
'92': Parisienne
'93': Superclarendon Regular
'94': Oxygen
'95': Nixie One
'96': Permanent Marker
'97': Medula One
'98': Cabin Sketch
'99': Vollkorn
'100': Yeseva One
'101': Montserrat Alternates
'102': Satisfy
'103': Sacramento
'104': Carter One
'105': Glass Antiqua
'106': Mr Dafoe
'107': Lauren
'108': Oranienbaum
'109': Scope One
'110': Mr De Haviland
'111': Pirou
'112': Rise
'113': Sensei
'114': Yesteryear
'115': Delius
'116': Sue Ellen Francisco
'117': Copse
'118': Kaushan Script
'119': Monda
'120': Pattaya
'121': Dancing Script
'122': Reem Kufi
'123': Playlist Caps
'124': Beacon
'125': Reenie Beanie
'126': Overlock
'127': Mrs Saint Delafield
'128': Open Sans Condensed
'129': Covered By Your Grace
'130': Varela Round
'131': Allura
'132': Buda
'133': Mikodacs
'134': Arkana Script
'135': Nothing You Could Do
'136': Rochester
'137': Fredericka The Great
'138': Port Lligat Slab
'139': Heebo
'140': Arimo
'141': Dawning Of A New Day
'142': Aldrich
'143': Neucha
'144': Source Serif Pro
'145': Shadows Into Light Two
'146': Armata
'147': Cutive Mono
'148': Merienda One
'149': Rissa Typeface
'150': Stalemate
'151': Assistant
'152': Pathway Gothic One
'153': Breathe Press
'154': Suez One
'155': Berkshire Swash
'156': Rakkas
'157': Pinyon Script
'158': Pt Sans
'159': Delius Swash Caps
'160': Kurale
'161': Offside
'162': Clicker Script
'163': Mate
'164': Bentham
'165': Rye
'166': Lalezar
'167': Julius Sans One
'168': Quattrocento
'169': V T323
'170': Finger Paint
'171': La Belle Aurore
'172': Inconsolata
'173': Press Start 2P
'174': Junge
'175': Iceberg
'176': Kelly Slab
'177': Handlee
'178': Rosario
'179': Gaegu
'180': Homemade Apple
'181': Londrina Shadow
'182': Meddon
'183': Elsie Swash Caps
'184': Share Tech Mono
'185': Black Ops One
'186': Fauna One
'187': Alice
'188': Arizonia
'189': Text Me One
'190': Nova Square
'191': Bungee Shade
'192': Just Me Again Down Here
'193': Jacques Francois Shadow
'194': Cousine
'195': Forum
'196': Architects Daughter
'197': Cedarville Cursive
'198': Elsie
'199': Sirin Stencil
'200': Vampiro One
'201': Dorsa
'202': Marcellus Sc
'203': Kumar One
'204': Allerta Stencil
'205': Courgette
'206': Rationale
'207': Gluk Znikomitno25
'208': Happy Monkey
'209': Stint Ultra Expanded
'210': Rock Salt
'211': Im Fell Dw Pica Sc
'212': Faster One
'213': Bellefair
'214': Wire One
'215': Geo
'216': Farsan
'217': League Script
'218': Chathura
'219': Euphoria Script
'220': Zeyada
'221': Jura
'222': Loved By The King
'223': Give You Glory
'224': Znikomitno24
'225': Gluk Glametrix
'226': Alegreya Sans
'227': Kristi
'228': Knewave Outline
'229': Pangolin
'230': Okolaks
'231': Seymour One
'232': Didact Gothic
'233': Kavivanar
'234': Underdog
'235': Alef
'236': Italianno
'237': Londrina Sketch
'238': Secular One
'239': Katibeh
'240': Caesar Dressing
'241': Lovers Quarrel
'242': Iceland
'243': Im Fell
'244': Waiting For The Sunrise
'245': David Libre
'246': Marck Script
'247': Kumar One Outline
'248': Znikomit
'249': Monsieur La Doulaise
'250': Gruppo
'251': Monofett
'252': Gfs Didot
'253': Petit Formal Script
'254': Dukomdesign Constantine
'255': Brusher
'256': Eb Garamond
'257': Ewert
'258': Bilbo
'259': Raleway Dots
'260': Gabriela
'261': Ruslan Display
- name: font_size
sequence: float32
- name: text_align
sequence:
class_label:
names:
'0': ''
'1': left
'2': center
'3': right
- name: angle
sequence: float32
- name: capitalize
sequence:
class_label:
names:
'0': 'false'
'1': 'true'
- name: line_height
sequence: float32
- name: letter_spacing
sequence: float32
- name: suitability
sequence:
class_label:
names:
'0': mobile
- name: keywords
sequence: string
- name: industries
sequence:
class_label:
names:
'0': marketingAds
'1': entertainmentLeisure
'2': services
'3': retail
'4': businessFinance
'5': educationTraining
'6': foodBeverages
'7': artCrafts
'8': fashionStyle
'9': healthWellness
'10': ecologyNature
'11': nonProfitCharity
'12': techGadgets
'13': beautyCosmetics
'14': homeLiving
'15': familyKids
'16': travelTourism
'17': sportFitness
'18': corporate
'19': petsAnimals
'20': realEstateConstruction
'21': transportDelivery
'22': religionFaith
'23': hrRecruitment
- name: preview
dtype: image
- name: cluster_index
dtype: int64
splits:
- name: train
num_bytes: 5058614277.34
num_examples: 19095
- name: validation
num_bytes: 538185754.149
num_examples: 1951
- name: test
num_bytes: 649876234.375
num_examples: 2375
download_size: 6188050025
dataset_size: 6246676265.864
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Dataset Card for Crello
## Table of Contents
- [Dataset Card for Crello](#dataset-card-for-crello)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [CanvasVAE github](https://github.com/CyberAgentAILab/canvas-vae)
- **Repository:**
- **Paper:** [CanvasVAE: Learning to Generate Vector Graphic Documents](https://arxiv.org/abs/2108.01249)
- **Leaderboard:**
- **Point of Contact:** [Kota Yamaguchi](https://github.com/kyamagu)
### Dataset Summary
The Crello dataset is compiled for the study of vector graphic documents. The dataset contains document meta-data such as canvas size and pre-rendered elements such as images or text boxes. The original templates were collected from [crello.com](https://crello.com) (now [create.vista.com](https://create.vista.com/)) and converted to a low-resolution format suitable for machine learning analysis.
### Usage
```python
import datasets
dataset = datasets.load_dataset("cyberagent/crello")
```
Older revisions are available via the `revision` option.
```python
import datasets
dataset = datasets.load_dataset("cyberagent/crello", revision="3.1")
```
### Supported Tasks and Leaderboards
[CanvasVAE](https://arxiv.org/abs/2108.01249) studies unsupervised document generation.
### Languages
Almost all design templates use English.
## Dataset Structure
### Data Instances
Each instance has scalar attributes (canvas) and sequence attributes (elements). Categorical values are stored as integer values. Check `ClassLabel` features of the dataset for the list of categorical labels.
```
{'id': '592d6c2c95a7a863ddcda140',
'length': 8,
'group': 4,
'format': 20,
'canvas_width': 3,
'canvas_height': 1,
'category': 0,
'title': 'Beauty Blog Ad Woman with Unusual Hairstyle',
'type': [1, 3, 3, 3, 3, 4, 4, 4],
'left': [0.0,
-0.0009259259095415473,
0.24444444477558136,
0.5712962746620178,
0.2657407522201538,
0.369228333234787,
0.2739444375038147,
0.44776931405067444],
'top': [0.0,
-0.0009259259095415473,
0.37037035822868347,
0.41296297311782837,
0.41296297311782837,
0.8946287035942078,
0.4549448788166046,
0.40591198205947876],
'width': [1.0,
1.0018517971038818,
0.510185182094574,
0.16296295821666718,
0.16296295821666718,
0.30000001192092896,
0.4990740716457367,
0.11388888955116272],
'height': [1.0,
1.0018517971038818,
0.25833332538604736,
0.004629629664123058,
0.004629629664123058,
0.016611294820904732,
0.12458471953868866,
0.02657807245850563],
'opacity': [1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0],
'text': ['', '', '', '', '', 'STAY WITH US', 'FOLLOW', 'PRESS'],
'font': [0, 0, 0, 0, 0, 152, 172, 152],
'font_size': [0.0, 0.0, 0.0, 0.0, 0.0, 18.0, 135.0, 30.0],
'text_align': [0, 0, 0, 0, 0, 2, 2, 2],
'angle': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
'capitalize': [0, 0, 0, 0, 0, 0, 0, 0],
'line_height': [1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0],
'letter_spacing': [0.0, 0.0, 0.0, 0.0, 0.0, 14.0, 12.55813980102539, 3.0],
'suitability': [0],
'keywords': ['beautiful',
'beauty',
'blog',
'blogging',
'caucasian',
'cute',
'elegance',
'elegant',
'fashion',
'fashionable',
'femininity',
'glamour',
'hairstyle',
'luxury',
'model',
'stylish',
'vogue',
'website',
'woman',
'post',
'instagram',
'ig',
'insta',
'fashion',
'purple'],
'industries': [1, 8, 13],
'color': [[153.0, 118.0, 96.0],
[34.0, 23.0, 61.0],
[34.0, 23.0, 61.0],
[255.0, 255.0, 255.0],
[255.0, 255.0, 255.0],
[255.0, 255.0, 255.0],
[255.0, 255.0, 255.0],
[255.0, 255.0, 255.0]],
'image': [<PIL.PngImagePlugin.PngImageFile image mode=RGBA size=256x256>,
<PIL.PngImagePlugin.PngImageFile image mode=RGBA size=256x256>,
<PIL.PngImagePlugin.PngImageFile image mode=RGBA size=256x256>,
<PIL.PngImagePlugin.PngImageFile image mode=RGBA size=256x256>,
<PIL.PngImagePlugin.PngImageFile image mode=RGBA size=256x256>,
<PIL.PngImagePlugin.PngImageFile image mode=RGBA size=256x256>,
<PIL.PngImagePlugin.PngImageFile image mode=RGBA size=256x256>,
<PIL.PngImagePlugin.PngImageFile image mode=RGBA size=256x256>]}
```
To get a label for categorical values, use the `int2str` method:
```python
data = dataset['train'] # obtain the train set
key = "font"
example = data[0] # obtain first sample in train set
data.features[key].feature.int2str(example[key]) # obtain the text equivalent of the encoded values
```
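Note that `canvas_width` and `canvas_height` are also `ClassLabel` indices whose label names are pixel-size strings, so recovering integer dimensions takes an extra cast through `int2str`. A minimal sketch with stand-in label lists (the real, longer lists come from the dataset's features; the indices and the helper name here are illustrative assumptions):

```python
# Stand-ins for the ClassLabel name lists of canvas_width / canvas_height
# (truncated for illustration; the actual lists come from dataset.features).
canvas_width_names = ["1080", "1200", "940"]
canvas_height_names = ["1080", "1920", "315"]

def decode_canvas_size(width_idx: int, height_idx: int) -> tuple:
    """Map categorical indices back to integer pixel dimensions."""
    return (int(canvas_width_names[width_idx]),
            int(canvas_height_names[height_idx]))

print(decode_canvas_size(0, 1))  # (1080, 1920)
```

With the real dataset, the same cast appears in the rendering code later in this card as `int(features["canvas_width"].int2str(...))`.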
### Data Fields
In the following, categorical fields are shown as `categorical` type, but the actual storage is `int64`.
**Canvas attributes**
| Field | Type | Shape | Description |
| ------------- | ----------- | ------- | ----------------------------------------------------------------- |
| id | string | () | Template ID from crello.com |
| group | categorical | () | Broad design groups, such as social media posts or blog headers |
| format | categorical | () | Detailed design formats, such as Instagram post or postcard |
| category | categorical | () | Topic category of the design, such as holiday celebration |
| canvas_width | categorical | () | Canvas pixel width |
| canvas_height | categorical | () | Canvas pixel height |
| length | int64 | () | Length of elements |
| suitability | categorical | (None,) | List of display tags, only `mobile` tag exists |
| keywords      | string      | (None,) | List of keywords associated with this template                    |
| industries | categorical | (None,) | List of industry tags like `marketingAds` |
| preview | image | () | Preview image of the template for convenience; only for debugging |
| cluster_index | int64 | () | Cluster index used to split the dataset; only for debugging |
**Element attributes**
| Field | Type | Shape | Description |
| -------------- | ----------- | --------- | -------------------------------------------------------------------- |
| type | categorical | (None,) | Element type, such as vector shape, image, or text |
| left | float32 | (None,) | Element left position normalized to [0, 1] range w.r.t. canvas_width |
| top | float32 | (None,) | Element top position normalized to [0, 1] range w.r.t. canvas_height |
| width | float32 | (None,) | Element width normalized to [0, 1] range w.r.t. canvas_width |
| height | float32 | (None,) | Element height normalized to [0, 1] range w.r.t. canvas_height |
| color | int64 | (None, 3) | Extracted main RGB color of the element |
| opacity | float32 | (None,) | Opacity in [0, 1] range |
| image | image | (None,) | Pre-rendered 256x256 preview of the element encoded in PNG format |
| text | string | (None,) | Text content in UTF-8 encoding for text element |
| font | categorical | (None,) | Font family name for text element |
| font_size | float32 | (None,) | Font size (height) in pixels |
| text_align | categorical | (None,) | Horizontal text alignment, left, center, right for text element |
| angle | float32 | (None,) | Element rotation angle (radian) w.r.t. the center of the element |
| capitalize | categorical | (None,) | Binary flag to capitalize letters |
| line_height | float32 | (None,) | Scaling parameter to line height, default is 1.0 |
| letter_spacing | float32 | (None,) | Adjustment parameter for letter spacing, default is 0.0 |
Note that the color and pre-rendered images do not necessarily accurately reproduce the original design templates. The original template is accessible at the following URL if still available.
```
https://create.vista.com/artboard/?template=<template_id>
```
`left` and `top` can be negative because elements can be bigger than the canvas size.
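Because geometry is normalized, converting an element back to pixel coordinates is a single multiply by the decoded canvas size; negative pixel values are valid and simply mean the element extends past the canvas edge. A small sketch (the rect values resemble the sample instance above; the 1080x1080 canvas size and the helper name are illustrative assumptions):

```python
def to_pixel_rect(left, top, width, height, canvas_w, canvas_h):
    """Denormalize a [0, 1]-relative element rect to pixel coordinates."""
    return (left * canvas_w, top * canvas_h, width * canvas_w, height * canvas_h)

# A background element slightly larger than an assumed 1080x1080 canvas,
# shifted just off the top-left edge (values resemble the sample instance).
x, y, w, h = to_pixel_rect(-0.0009259259, -0.0009259259, 1.0018518, 1.0018518,
                           1080, 1080)
print(round(x), round(y), round(w), round(h))  # -1 -1 1082 1082
```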
### Data Splits
The Crello dataset has 3 splits: train, validation, and test. The current split is generated by appearance-based clustering.
| Split | Count |
| --------- | ----- |
| train | 19095 |
| validation | 1951 |
| test | 2375 |
### Visualization
Each example can be visualized with [`skia-python`](https://kyamagu.github.io/skia-python/) using the following approach. Note that this does not guarantee an appearance identical to the original template; currently, the quality of text rendering is far from perfect.
```python
import io
from typing import Any, Dict
import datasets
import numpy as np
import skia
def render(features: datasets.Features, example: Dict[str, Any], max_size: float=512.) -> bytes:
"""Render parsed sequence example onto an image and return as PNG bytes."""
canvas_width = int(features["canvas_width"].int2str(example["canvas_width"]))
canvas_height = int(features["canvas_height"].int2str(example["canvas_height"]))
scale = min(1.0, max_size / canvas_width, max_size / canvas_height)
surface = skia.Surface(int(scale * canvas_width), int(scale * canvas_height))
with surface as canvas:
canvas.scale(scale, scale)
for index in range(example["length"]):
pil_image = example["image"][index]
image = skia.Image.frombytes(
pil_image.convert('RGBA').tobytes(),
pil_image.size,
skia.kRGBA_8888_ColorType)
left = example["left"][index] * canvas_width
top = example["top"][index] * canvas_height
width = example["width"][index] * canvas_width
height = example["height"][index] * canvas_height
rect = skia.Rect.MakeXYWH(left, top, width, height)
paint = skia.Paint(Alphaf=example["opacity"][index], AntiAlias=True)
angle = example["angle"][index]
with skia.AutoCanvasRestore(canvas):
if angle != 0:
degree = 180. * angle / np.pi
canvas.rotate(degree, left + width / 2., top + height / 2.)
canvas.drawImageRect(image, rect, paint=paint)
image = surface.makeImageSnapshot()
with io.BytesIO() as f:
image.save(f, skia.kPNG)
return f.getvalue()
```
## Dataset Creation
### Curation Rationale
The Crello dataset is compiled for the general study of vector graphic documents, with the goal of producing a dataset that offers complete vector graphic information suitable for neural methodologies.
### Source Data
#### Initial Data Collection and Normalization
The dataset was initially scraped from the former `crello.com` and pre-processed into the above format.
#### Who are the source language producers?
While [create.vista.com](https://create.vista.com/) owns those templates, they seem to have been originally created by a specific group of design studios.
### Personal and Sensitive Information
The dataset does not contain any personal information about the creators, but may contain pictures of people in the design templates.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset was developed for advancing the general study of vector graphic documents, especially for generative systems of graphic design. Successful utilization might enable the automation of creative workflows that currently involve human designers.
### Discussion of Biases
The templates contained in the dataset reflect the biases appearing in the source data, which could present gender biases in specific design categories.
### Other Known Limitations
Due to the unknown data specification of the source data, the color and pre-rendered images do not necessarily accurately reproduce the original design templates. The original template is accessible at the following URL if still available.
https://create.vista.com/artboard/?template=<template_id>
## Additional Information
### Dataset Curators
The Crello dataset was developed by [Kota Yamaguchi](https://github.com/kyamagu).
### Licensing Information
The origin of the dataset is [create.vista.com](https://create.vista.com) (formerly `crello.com`).
The distributor ("We") does not own the copyrights of the original design templates.
By using the Crello dataset, the user of this dataset ("You") must agree to the
[VistaCreate License Agreements](https://create.vista.com/faq/legal/licensing/license_agreements/).
The dataset is distributed under [CDLA-Permissive-2.0 license](https://cdla.dev/permissive-2-0/).
**Note**
We do not redistribute the original files, as the terms do not allow it.
### Citation Information
```
@article{yamaguchi2021canvasvae,
  title={CanvasVAE: Learning to Generate Vector Graphic Documents},
  author={Yamaguchi, Kota},
  journal={ICCV},
  year={2021}
}
```
### Releases
4.0.0: v4 release (Dec 5, 2023)
- Change the dataset split based on the template appearance to avoid near-duplicates: no compatibility with v3.
- Class labels have been reordered: no compatibility with v3.
- Small improvement to font rendering.
3.1: bugfix release (Feb 16, 2023)
- Fix a bug that ignores newline characters in some of the texts.
3.0: v3 release (Feb 13, 2023)
- Migrate to Hugging Face Hub.
- Fix various text rendering bugs.
- Change split generation criteria for avoiding near-duplicates: no compatibility with v2 splits.
- Incorporate a motion picture thumbnail in templates.
- Add `title`, `keywords`, `suitability`, and `industries` canvas attributes.
- Add `capitalize`, `line_height`, and `letter_spacing` element attributes.
2.0: v2 release (May 26, 2022)
- Add `text`, `font`, `font_size`, `text_align`, and `angle` element attributes.
- Include rendered text element in `image_bytes`.
1.0: v1 release (Aug 24, 2021)
### Contributions
Thanks to [@kyamagu](https://github.com/kyamagu) for adding this dataset. |
Hermath/exam_data | ---
language:
- ko
--- |
mrajbrahma/bodo-words | ---
license: cc-by-sa-4.0
---
|
sboughorbel/mmlu_arabic | ---
dataset_info:
- config_name: abstract_algebra
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 30490
num_examples: 100
- name: dev
num_bytes: 1041
num_examples: 4
download_size: 27065856
dataset_size: 31531
- config_name: all
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 9797524
num_examples: 13942
- name: dev
num_bytes: 104891
num_examples: 224
download_size: 27065856
dataset_size: 9902415
- config_name: anatomy
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 48329
num_examples: 135
- name: dev
num_bytes: 1182
num_examples: 4
download_size: 27065856
dataset_size: 49511
- config_name: astronomy
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 70090
num_examples: 152
- name: dev
num_bytes: 2477
num_examples: 4
download_size: 27065856
dataset_size: 72567
- config_name: business_ethics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 50709
num_examples: 100
- name: dev
num_bytes: 2718
num_examples: 4
download_size: 27065856
dataset_size: 53427
- config_name: clinical_knowledge
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 95443
num_examples: 265
- name: dev
num_bytes: 1488
num_examples: 4
download_size: 27065856
dataset_size: 96931
- config_name: college_biology
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 74512
num_examples: 144
- name: dev
num_bytes: 1715
num_examples: 4
download_size: 27065856
dataset_size: 76227
- config_name: college_chemistry
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 37700
num_examples: 100
- name: dev
num_bytes: 1400
num_examples: 4
download_size: 27065856
dataset_size: 39100
- config_name: college_computer_science
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 65480
num_examples: 100
- name: dev
num_bytes: 2987
num_examples: 4
download_size: 27065856
dataset_size: 68467
- config_name: college_mathematics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 37194
num_examples: 100
- name: dev
num_bytes: 1292
num_examples: 4
download_size: 27065856
dataset_size: 38486
- config_name: college_medicine
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 126819
num_examples: 173
- name: dev
num_bytes: 2264
num_examples: 4
download_size: 27065856
dataset_size: 129083
- config_name: college_physics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 43195
num_examples: 102
- name: dev
num_bytes: 1424
num_examples: 4
download_size: 27065856
dataset_size: 44619
- config_name: computer_security
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 41302
num_examples: 100
- name: dev
num_bytes: 1543
num_examples: 4
download_size: 27065856
dataset_size: 42845
- config_name: conceptual_physics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 59163
num_examples: 235
- name: dev
num_bytes: 1135
num_examples: 4
download_size: 27065856
dataset_size: 60298
- config_name: default
features:
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: correct_answer
dtype: string
splits:
- name: validation
num_bytes: 213556
num_examples: 456
- name: test
num_bytes: 19534326
num_examples: 28084
download_size: 27157379
dataset_size: 19747882
- config_name: econometrics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 65489
num_examples: 114
- name: dev
num_bytes: 1960
num_examples: 4
download_size: 27065856
dataset_size: 67449
- config_name: electrical_engineering
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 37794
num_examples: 145
- name: dev
num_bytes: 1319
num_examples: 4
download_size: 27065856
dataset_size: 39113
- config_name: elementary_mathematics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 110038
num_examples: 378
- name: dev
num_bytes: 1741
num_examples: 4
download_size: 27065856
dataset_size: 111779
- config_name: formal_logic
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 69134
num_examples: 126
- name: dev
num_bytes: 2263
num_examples: 4
download_size: 27065856
dataset_size: 71397
- config_name: global_facts
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 31164
num_examples: 100
- name: dev
num_bytes: 957
num_examples: 4
download_size: 27065856
dataset_size: 32121
- config_name: high_school_biology
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 165901
num_examples: 310
- name: dev
num_bytes: 2373
num_examples: 4
download_size: 27065856
dataset_size: 168274
- config_name: high_school_chemistry
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 85920
num_examples: 203
- name: dev
num_bytes: 1774
num_examples: 4
download_size: 27065856
dataset_size: 87694
- config_name: high_school_european_history
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 386226
num_examples: 165
- name: dev
num_bytes: 1729
num_examples: 4
download_size: 27065856
dataset_size: 387955
- config_name: high_school_geography
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 62781
num_examples: 198
- name: dev
num_bytes: 1528
num_examples: 4
download_size: 27065856
dataset_size: 64309
- config_name: high_school_government_and_politics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 98812
num_examples: 193
- name: dev
num_bytes: 1861
num_examples: 4
download_size: 27065856
dataset_size: 100673
- config_name: high_school_macroeconomics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 177451
num_examples: 390
- name: dev
num_bytes: 1511
num_examples: 4
download_size: 27065856
dataset_size: 178962
- config_name: high_school_mathematics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 101776
num_examples: 270
- name: dev
num_bytes: 1239
num_examples: 4
download_size: 27065856
dataset_size: 103015
- config_name: high_school_microeconomics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 114084
num_examples: 238
- name: dev
num_bytes: 1132
num_examples: 4
download_size: 27065856
dataset_size: 115216
- config_name: high_school_physics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 82994
num_examples: 151
- name: dev
num_bytes: 1585
num_examples: 4
download_size: 27065856
dataset_size: 84579
- config_name: high_school_psychology
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 233162
num_examples: 545
- name: dev
num_bytes: 1829
num_examples: 4
download_size: 27065856
dataset_size: 234991
- config_name: high_school_statistics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 163142
num_examples: 216
- name: dev
num_bytes: 2178
num_examples: 4
download_size: 27065856
dataset_size: 165320
- config_name: high_school_us_history
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 429902
num_examples: 204
- name: dev
num_bytes: 1575
num_examples: 4
download_size: 27065856
dataset_size: 431477
- config_name: high_school_world_history
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 544111
num_examples: 237
- name: dev
num_bytes: 1573
num_examples: 4
download_size: 27065856
dataset_size: 545684
- config_name: human_aging
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 69769
num_examples: 223
- name: dev
num_bytes: 1218
num_examples: 4
download_size: 27065856
dataset_size: 70987
- config_name: human_sexuality
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 48542
num_examples: 131
- name: dev
num_bytes: 1328
num_examples: 4
download_size: 27065856
dataset_size: 49870
- config_name: international_law
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 78993
num_examples: 121
- name: dev
num_bytes: 2996
num_examples: 4
download_size: 27065856
dataset_size: 81989
- config_name: jurisprudence
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 48413
num_examples: 108
- name: dev
num_bytes: 1125
num_examples: 4
download_size: 27065856
dataset_size: 49538
- config_name: logical_fallacies
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 68738
num_examples: 163
- name: dev
num_bytes: 1725
num_examples: 4
download_size: 27065856
dataset_size: 70463
- config_name: machine_learning
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 50618
num_examples: 112
- name: dev
num_bytes: 2857
num_examples: 4
download_size: 27065856
dataset_size: 53475
- config_name: management
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 28137
num_examples: 103
- name: dev
num_bytes: 865
num_examples: 4
download_size: 27065856
dataset_size: 29002
- config_name: marketing
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 92784
num_examples: 234
- name: dev
num_bytes: 2155
num_examples: 4
download_size: 27065856
dataset_size: 94939
- config_name: medical_genetics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 31340
num_examples: 100
- name: dev
num_bytes: 1621
num_examples: 4
download_size: 27065856
dataset_size: 32961
- config_name: miscellaneous
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 224150
num_examples: 783
- name: dev
num_bytes: 937
num_examples: 4
download_size: 27065856
dataset_size: 225087
- config_name: moral_disputes
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 157552
num_examples: 346
- name: dev
num_bytes: 1794
num_examples: 4
download_size: 27065856
dataset_size: 159346
- config_name: moral_scenarios
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 626922
num_examples: 895
- name: dev
num_bytes: 2485
num_examples: 4
download_size: 27065856
dataset_size: 629407
- config_name: nutrition
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 145972
num_examples: 306
- name: dev
num_bytes: 2266
num_examples: 4
download_size: 27065856
dataset_size: 148238
- config_name: philosophy
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 113842
num_examples: 311
- name: dev
num_bytes: 940
num_examples: 4
download_size: 27065856
dataset_size: 114782
- config_name: prehistory
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 134220
num_examples: 324
- name: dev
num_bytes: 2242
num_examples: 4
download_size: 27065856
dataset_size: 136462
- config_name: professional_accounting
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 195436
num_examples: 282
- name: dev
num_bytes: 1959
num_examples: 4
download_size: 27065856
dataset_size: 197395
- config_name: professional_law
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 2696027
num_examples: 1534
- name: dev
num_bytes: 6414
num_examples: 4
download_size: 27065856
dataset_size: 2702441
- config_name: professional_medicine
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 334982
num_examples: 272
- name: dev
num_bytes: 2493
num_examples: 4
download_size: 27065856
dataset_size: 337475
- config_name: professional_psychology
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 323641
num_examples: 612
- name: dev
num_bytes: 1854
num_examples: 4
download_size: 27065856
dataset_size: 325495
- config_name: public_relations
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 42540
num_examples: 110
- name: dev
num_bytes: 1591
num_examples: 4
download_size: 27065856
dataset_size: 44131
- config_name: security_studies
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 303641
num_examples: 245
- name: dev
num_bytes: 5190
num_examples: 4
download_size: 27065856
dataset_size: 308831
- config_name: sociology
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 98095
num_examples: 201
- name: dev
num_bytes: 2075
num_examples: 4
download_size: 27065856
dataset_size: 100170
- config_name: us_foreign_policy
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 43826
num_examples: 100
- name: dev
num_bytes: 1934
num_examples: 4
download_size: 27065856
dataset_size: 45760
- config_name: virology
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 63088
num_examples: 166
- name: dev
num_bytes: 1323
num_examples: 4
download_size: 27065856
dataset_size: 64411
- config_name: world_religions
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 35949
num_examples: 171
- name: dev
num_bytes: 711
num_examples: 4
download_size: 27065856
dataset_size: 36660
---
|
arbml/AlRiyadh_Newspaper_Covid | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: string
- name: ID
dtype: string
- name: Category
dtype: string
- name: Source
dtype: string
- name: Title
dtype: string
- name: Subtitle
dtype: string
- name: Image
dtype: string
- name: Caption
dtype: string
- name: Text
dtype: string
- name: URL
dtype: string
- name: FullText
dtype: string
- name: FullTextCleaned
dtype: string
- name: FullTextWords
dtype: string
- name: WordsCounts
dtype: string
- name: Date
dtype: string
- name: Time
dtype: string
- name: Images
dtype: string
- name: Captions
dtype: string
- name: Terms
dtype: string
splits:
- name: train
num_bytes: 376546224
num_examples: 24084
download_size: 164286254
dataset_size: 376546224
---
# Dataset Card for "AlRiyadh_Newspaper_Covid"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/OK_VQA_google_flan_t5_xxl_mode_VQAv2_visclues_detection_caption_module_filter_ns_5046_OE | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 920304
num_examples: 5046
download_size: 356829
dataset_size: 920304
---
# Dataset Card for "OK_VQA_google_flan_t5_xxl_mode_VQAv2_visclues_detection_caption_module_filter_ns_5046_OE"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_BarraHome__LLaMarada-7B-v0.1-16bit | ---
pretty_name: Evaluation run of BarraHome/LLaMarada-7B-v0.1-16bit
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BarraHome/LLaMarada-7B-v0.1-16bit](https://huggingface.co/BarraHome/LLaMarada-7B-v0.1-16bit)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BarraHome__LLaMarada-7B-v0.1-16bit\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-21T04:08:48.061083](https://huggingface.co/datasets/open-llm-leaderboard/details_BarraHome__LLaMarada-7B-v0.1-16bit/blob/main/results_2024-02-21T04-08-48.061083.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.40037501295309713,\n\
\ \"acc_stderr\": 0.0340545031345763,\n \"acc_norm\": 0.40523725336590083,\n\
\ \"acc_norm_stderr\": 0.03486145533193288,\n \"mc1\": 0.22888616891064872,\n\
\ \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.3713093326467247,\n\
\ \"mc2_stderr\": 0.013334641945482336\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4786689419795222,\n \"acc_stderr\": 0.014598087973127106,\n\
\ \"acc_norm\": 0.5332764505119454,\n \"acc_norm_stderr\": 0.01457899585960581\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5493925512846046,\n\
\ \"acc_stderr\": 0.004965375341643133,\n \"acc_norm\": 0.7602071300537742,\n\
\ \"acc_norm_stderr\": 0.004260843849128677\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34814814814814815,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.34814814814814815,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.03999309712777471,\n\
\ \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.03999309712777471\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.38113207547169814,\n \"acc_stderr\": 0.029890609686286637,\n\
\ \"acc_norm\": 0.38113207547169814,\n \"acc_norm_stderr\": 0.029890609686286637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3611111111111111,\n\
\ \"acc_stderr\": 0.040166600304512336,\n \"acc_norm\": 0.3611111111111111,\n\
\ \"acc_norm_stderr\": 0.040166600304512336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.3236994219653179,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.0314108219759624,\n\
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.0314108219759624\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3724137931034483,\n \"acc_stderr\": 0.04028731532947558,\n\
\ \"acc_norm\": 0.3724137931034483,\n \"acc_norm_stderr\": 0.04028731532947558\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.023000086859068646,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.023000086859068646\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4064516129032258,\n\
\ \"acc_stderr\": 0.027941727346256315,\n \"acc_norm\": 0.4064516129032258,\n\
\ \"acc_norm_stderr\": 0.027941727346256315\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.032550867699701024,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.032550867699701024\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.03895658065271846,\n\
\ \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.03895658065271846\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.41414141414141414,\n \"acc_stderr\": 0.03509438348879629,\n \"\
acc_norm\": 0.41414141414141414,\n \"acc_norm_stderr\": 0.03509438348879629\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5129533678756477,\n \"acc_stderr\": 0.03607228061047749,\n\
\ \"acc_norm\": 0.5129533678756477,\n \"acc_norm_stderr\": 0.03607228061047749\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34102564102564104,\n \"acc_stderr\": 0.02403548967633508,\n\
\ \"acc_norm\": 0.34102564102564104,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02671924078371218,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02671924078371218\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3697478991596639,\n \"acc_stderr\": 0.03135709599613591,\n \
\ \"acc_norm\": 0.3697478991596639,\n \"acc_norm_stderr\": 0.03135709599613591\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.45871559633027525,\n \"acc_stderr\": 0.021364122533881695,\n \"\
acc_norm\": 0.45871559633027525,\n \"acc_norm_stderr\": 0.021364122533881695\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.21296296296296297,\n \"acc_stderr\": 0.02792096314799366,\n \"\
acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.02792096314799366\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.45588235294117646,\n \"acc_stderr\": 0.03495624522015474,\n \"\
acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.03495624522015474\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.540084388185654,\n \"acc_stderr\": 0.03244246810187913,\n \
\ \"acc_norm\": 0.540084388185654,\n \"acc_norm_stderr\": 0.03244246810187913\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.49327354260089684,\n\
\ \"acc_stderr\": 0.033554765962343545,\n \"acc_norm\": 0.49327354260089684,\n\
\ \"acc_norm_stderr\": 0.033554765962343545\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4580152671755725,\n \"acc_stderr\": 0.04369802690578756,\n\
\ \"acc_norm\": 0.4580152671755725,\n \"acc_norm_stderr\": 0.04369802690578756\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968431,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968431\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04826217294139892,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04826217294139892\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.39263803680981596,\n \"acc_stderr\": 0.03836740907831029,\n\
\ \"acc_norm\": 0.39263803680981596,\n \"acc_norm_stderr\": 0.03836740907831029\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.39805825242718446,\n \"acc_stderr\": 0.048467482539772386,\n\
\ \"acc_norm\": 0.39805825242718446,\n \"acc_norm_stderr\": 0.048467482539772386\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.594017094017094,\n\
\ \"acc_stderr\": 0.03217180182641086,\n \"acc_norm\": 0.594017094017094,\n\
\ \"acc_norm_stderr\": 0.03217180182641086\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.050241839379569095,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.050241839379569095\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5351213282247765,\n\
\ \"acc_stderr\": 0.017835798806290645,\n \"acc_norm\": 0.5351213282247765,\n\
\ \"acc_norm_stderr\": 0.017835798806290645\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4653179190751445,\n \"acc_stderr\": 0.026854257928258893,\n\
\ \"acc_norm\": 0.4653179190751445,\n \"acc_norm_stderr\": 0.026854257928258893\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.028452639985088006,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.028452639985088006\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.47266881028938906,\n\
\ \"acc_stderr\": 0.02835563356832818,\n \"acc_norm\": 0.47266881028938906,\n\
\ \"acc_norm_stderr\": 0.02835563356832818\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3950617283950617,\n \"acc_stderr\": 0.02720111766692565,\n\
\ \"acc_norm\": 0.3950617283950617,\n \"acc_norm_stderr\": 0.02720111766692565\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.29432624113475175,\n \"acc_stderr\": 0.0271871270115038,\n \
\ \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.0271871270115038\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3428943937418514,\n\
\ \"acc_stderr\": 0.012123463271585895,\n \"acc_norm\": 0.3428943937418514,\n\
\ \"acc_norm_stderr\": 0.012123463271585895\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.40808823529411764,\n \"acc_stderr\": 0.029855261393483924,\n\
\ \"acc_norm\": 0.40808823529411764,\n \"acc_norm_stderr\": 0.029855261393483924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3709150326797386,\n \"acc_stderr\": 0.01954210156485412,\n \
\ \"acc_norm\": 0.3709150326797386,\n \"acc_norm_stderr\": 0.01954210156485412\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n\
\ \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.509090909090909,\n\
\ \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.44081632653061226,\n \"acc_stderr\": 0.03178419114175363,\n\
\ \"acc_norm\": 0.44081632653061226,\n \"acc_norm_stderr\": 0.03178419114175363\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5323383084577115,\n\
\ \"acc_stderr\": 0.03528131472933607,\n \"acc_norm\": 0.5323383084577115,\n\
\ \"acc_norm_stderr\": 0.03528131472933607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.037891344246115496,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.037891344246115496\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.03786720706234213,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.03786720706234213\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22888616891064872,\n\
\ \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.3713093326467247,\n\
\ \"mc2_stderr\": 0.013334641945482336\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7095501183898973,\n \"acc_stderr\": 0.01275881344806461\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06974981046247157,\n \
\ \"acc_stderr\": 0.007016389571013844\n }\n}\n```"
repo_url: https://huggingface.co/BarraHome/LLaMarada-7B-v0.1-16bit
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|arc:challenge|25_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|gsm8k|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hellaswag|10_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T04-08-48.061083.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-21T04-08-48.061083.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- '**/details_harness|winogrande|5_2024-02-21T04-08-48.061083.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-21T04-08-48.061083.parquet'
- config_name: results
data_files:
- split: 2024_02_21T04_08_48.061083
path:
- results_2024-02-21T04-08-48.061083.parquet
- split: latest
path:
- results_2024-02-21T04-08-48.061083.parquet
---
# Dataset Card for Evaluation run of BarraHome/LLaMarada-7B-v0.1-16bit
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BarraHome/LLaMarada-7B-v0.1-16bit](https://huggingface.co/BarraHome/LLaMarada-7B-v0.1-16bit) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BarraHome__LLaMarada-7B-v0.1-16bit",
"harness_winogrande_5",
split="train")
```
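In the configurations above, each run's timestamp (e.g. `2024-02-21T04:08:48.061083`) appears as a split named `2024_02_21T04_08_48.061083`, i.e. with `-` and `:` replaced by `_`. A small helper to map between the two forms (a sketch; the function name is illustrative):

```python
def timestamp_to_split_name(ts: str) -> str:
    """Map a run timestamp such as '2024-02-21T04:08:48.061083'
    to the corresponding split name '2024_02_21T04_08_48.061083'."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split_name("2024-02-21T04:08:48.061083"))
# -> 2024_02_21T04_08_48.061083
```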
## Latest results
These are the [latest results from run 2024-02-21T04:08:48.061083](https://huggingface.co/datasets/open-llm-leaderboard/details_BarraHome__LLaMarada-7B-v0.1-16bit/blob/main/results_2024-02-21T04-08-48.061083.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.40037501295309713,
"acc_stderr": 0.0340545031345763,
"acc_norm": 0.40523725336590083,
"acc_norm_stderr": 0.03486145533193288,
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.3713093326467247,
"mc2_stderr": 0.013334641945482336
},
"harness|arc:challenge|25": {
"acc": 0.4786689419795222,
"acc_stderr": 0.014598087973127106,
"acc_norm": 0.5332764505119454,
"acc_norm_stderr": 0.01457899585960581
},
"harness|hellaswag|10": {
"acc": 0.5493925512846046,
"acc_stderr": 0.004965375341643133,
"acc_norm": 0.7602071300537742,
"acc_norm_stderr": 0.004260843849128677
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40789473684210525,
"acc_stderr": 0.03999309712777471,
"acc_norm": 0.40789473684210525,
"acc_norm_stderr": 0.03999309712777471
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.38113207547169814,
"acc_stderr": 0.029890609686286637,
"acc_norm": 0.38113207547169814,
"acc_norm_stderr": 0.029890609686286637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.040166600304512336,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.040166600304512336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.0314108219759624,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.0314108219759624
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3724137931034483,
"acc_stderr": 0.04028731532947558,
"acc_norm": 0.3724137931034483,
"acc_norm_stderr": 0.04028731532947558
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.023000086859068646,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.023000086859068646
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4064516129032258,
"acc_stderr": 0.027941727346256315,
"acc_norm": 0.4064516129032258,
"acc_norm_stderr": 0.027941727346256315
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.032550867699701024,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.032550867699701024
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.03895658065271846,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.03895658065271846
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.41414141414141414,
"acc_stderr": 0.03509438348879629,
"acc_norm": 0.41414141414141414,
"acc_norm_stderr": 0.03509438348879629
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5129533678756477,
"acc_stderr": 0.03607228061047749,
"acc_norm": 0.5129533678756477,
"acc_norm_stderr": 0.03607228061047749
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34102564102564104,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.34102564102564104,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02671924078371218,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02671924078371218
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3697478991596639,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.3697478991596639,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.45871559633027525,
"acc_stderr": 0.021364122533881695,
"acc_norm": 0.45871559633027525,
"acc_norm_stderr": 0.021364122533881695
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.02792096314799366,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.02792096314799366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.03495624522015474,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.03495624522015474
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.540084388185654,
"acc_stderr": 0.03244246810187913,
"acc_norm": 0.540084388185654,
"acc_norm_stderr": 0.03244246810187913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.49327354260089684,
"acc_stderr": 0.033554765962343545,
"acc_norm": 0.49327354260089684,
"acc_norm_stderr": 0.033554765962343545
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4580152671755725,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.4580152671755725,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968431,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968431
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04826217294139892,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04826217294139892
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.39263803680981596,
"acc_stderr": 0.03836740907831029,
"acc_norm": 0.39263803680981596,
"acc_norm_stderr": 0.03836740907831029
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.39805825242718446,
"acc_stderr": 0.048467482539772386,
"acc_norm": 0.39805825242718446,
"acc_norm_stderr": 0.048467482539772386
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.594017094017094,
"acc_stderr": 0.03217180182641086,
"acc_norm": 0.594017094017094,
"acc_norm_stderr": 0.03217180182641086
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.050241839379569095,
"acc_norm": 0.49,
"acc_norm_stderr": 0.050241839379569095
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5351213282247765,
"acc_stderr": 0.017835798806290645,
"acc_norm": 0.5351213282247765,
"acc_norm_stderr": 0.017835798806290645
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4653179190751445,
"acc_stderr": 0.026854257928258893,
"acc_norm": 0.4653179190751445,
"acc_norm_stderr": 0.026854257928258893
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.028452639985088006,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.028452639985088006
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.47266881028938906,
"acc_stderr": 0.02835563356832818,
"acc_norm": 0.47266881028938906,
"acc_norm_stderr": 0.02835563356832818
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3950617283950617,
"acc_stderr": 0.02720111766692565,
"acc_norm": 0.3950617283950617,
"acc_norm_stderr": 0.02720111766692565
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.0271871270115038,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.0271871270115038
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3428943937418514,
"acc_stderr": 0.012123463271585895,
"acc_norm": 0.3428943937418514,
"acc_norm_stderr": 0.012123463271585895
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.40808823529411764,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.40808823529411764,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3709150326797386,
"acc_stderr": 0.01954210156485412,
"acc_norm": 0.3709150326797386,
"acc_norm_stderr": 0.01954210156485412
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.44081632653061226,
"acc_stderr": 0.03178419114175363,
"acc_norm": 0.44081632653061226,
"acc_norm_stderr": 0.03178419114175363
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5323383084577115,
"acc_stderr": 0.03528131472933607,
"acc_norm": 0.5323383084577115,
"acc_norm_stderr": 0.03528131472933607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.03786720706234213,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.03786720706234213
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.3713093326467247,
"mc2_stderr": 0.013334641945482336
},
"harness|winogrande|5": {
"acc": 0.7095501183898973,
"acc_stderr": 0.01275881344806461
},
"harness|gsm8k|5": {
"acc": 0.06974981046247157,
"acc_stderr": 0.007016389571013844
}
}
```
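The per-task keys in the results above follow a `harness|<task>|<n_shot>` naming scheme (the aggregate lives under the plain key `"all"`, which does not follow it). A small parser, sketched here with an illustrative name:

```python
def parse_harness_key(key: str):
    """Split a results key like 'harness|hendrycksTest-virology|5'
    into (suite, task, n_shot). The aggregate 'all' entry does not
    follow this scheme and should be handled separately."""
    suite, task, n_shot = key.split("|")
    return suite, task, int(n_shot)

print(parse_harness_key("harness|hendrycksTest-virology|5"))
# -> ('harness', 'hendrycksTest-virology', 5)
print(parse_harness_key("harness|truthfulqa:mc|0"))
# -> ('harness', 'truthfulqa:mc', 0)
```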
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Coooori/instruction_data_test_hf | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1185067
num_examples: 1099
download_size: 228178
dataset_size: 1185067
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "instruction_data_test_hf"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlekseyKorshuk/code-alpaca-eval-v0-deepseek-coder-7b-instruct-v1.5-annotations | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: model_input
list:
- name: content
dtype: string
- name: role
dtype: string
- name: baseline_response
dtype: string
- name: deepseek-coder-7b-instruct-v1.5_response
dtype: string
- name: deepseek-coder-7b-instruct-v1.5_annotation
dtype: float64
splits:
- name: train
num_bytes: 692025
num_examples: 134
download_size: 302809
dataset_size: 692025
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CVasNLPExperiments/Hatefulmemes_validation_google_flan_t5_xxl_mode_A_OCR_rices_ns_500 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full__text
num_bytes: 436166
num_examples: 500
- name: fewshot_0
num_bytes: 452042
num_examples: 500
download_size: 140571
dataset_size: 888208
---
# Dataset Card for "Hatefulmemes_validation_google_flan_t5_xxl_mode_A_OCR_rices_ns_500"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_sst2_his_him | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 6561
num_examples: 42
- name: test
num_bytes: 17298
num_examples: 102
- name: train
num_bytes: 204419
num_examples: 1660
download_size: 112466
dataset_size: 228278
---
# Dataset Card for "MULTI_VALUE_sst2_his_him"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mHossain/text_summary_v2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 2591192.7
num_examples: 540
- name: test
num_bytes: 287910.3
num_examples: 60
download_size: 1754170
dataset_size: 2879103.0
---
# Dataset Card for "text_summary_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Memin25/bigdatasets | ---
dataset_info:
features:
- name: review
dtype: string
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 3789237.142175591
num_examples: 46406
- name: validation
num_bytes: 421089.857824409
num_examples: 5157
download_size: 2280330
dataset_size: 4210327.0
---
# Dataset Card for "bigdataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
allen0523/robotphoto | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 241008411.0
num_examples: 300
download_size: 240936232
dataset_size: 241008411.0
---
# Dataset Card for "robotphoto"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
StrangeCroissant/fantasy_dataset | ---
task_categories:
- text-generation
- question-answering
language:
- en
tags:
- books
- fantasy
- scifi
- text
size_categories:
- 10K<n<100K
---
# Fantasy/Sci-fi Dataset
This dataset contains fantasy and sci-fi books in plain-text format. Each line of the dataset is one sentence of the concatenated corpus built from the following books:
1. 01 Horselords.txt
2. 01 The Second Generation.txt
3. 02 Tantras.txt
4. R.A. Salvatore - The Icewind Dale Trilogy - 2 - Streams of Silver.txt
5. RA SalvatoreThe Legacy of The Drow - 2 - Starless Night.txt
6. R.A.Salvatore - Icewind Dale Trilogy 1 - The Crystal Shard.txt
7. Star Wars - [Thrawn Trilogy 02] - Dark Force Rising (by Timothy Zahn).txt
8. Robert Jordan - The Wheel of Time 01 - Eye of the world.txt
9. 03 Crusade.txt
10. Salvatore, RA - Cleric Quintet 5 -The Chaos Curse.txt
11. 03 Waterdeep.txt
12. Clarke Arthur C - 3001 The Final Odissey.txt
13. Dragonlance Preludes 2 vol 2 - Flint the King.txt
14. 03 Dragons of Spring Dawning.txt
15. Lloyd Alexander - [Chronicles Of Prydain 4] Taran Wanderer.txt
16. 01 Dragons of Autumn Twilight.txt
17. 03 The Two Swords.txt
18. Robert Jordan - 12 - The Gathering Storm - Chapter One.txt
19. 02 War Of The Twins.txt
20. 01 - The Fellowship Of The Ring.txt
21. 02 The Lone Drow.txt
22. 01 The Thousand Orcs.txt
23. Auel, Jean - Earth's Children 03 - The Mammoth Hunters.txt
24. 01 Shadowdale.txt
25. Salvatore, RA - Cleric Quintet 3 - Night Masks.txt
26. Robert Jordan - The Strike at Shayol Ghul.txt
27. Salvatore, R.A. - Paths of Darkness 1 - The Silent Blade.txt
28. Clancy Tom - Patriot Games.txt
29. Lloyd Alexander - [Chronicles Of Prydain 1] Book of Three.txt
30. Lloyd Alexander - [Chronicles Of Prydain 2] Black Cauldron.txt
31. Salvatore, R.A. - Paths of Darkness 3 - Servant of the Shard.txt
32. 02 Crown of Fire.txt
33. 04 Prince of Lies.txt
34. Salvatore, R.A. - Paths of Darkness 2 - The Spine of the World.txt
35. Robert Jordan - The Wheel of Time 11 - Knife of Dreams.txt
36. Lloyd Alexander - [Chronicles Of Prydain 3] Castle Of Llyr.txt
37. R.A. Salvatore - The Dark Elf Trilogy.txt
38. 02 Dragonwall.txt
39. Frank Herbert - Dune.txt
40. 02 - The Two Towers.txt
41. Salvatore, RA - Cleric Quintet 4 - The Fallen Fortress.txt
42. Robert Jordan - The Wheel of Time 04 - The Shadow Rising.txt
43. Robert Jordan - The Wheel of Time 10 - Crossroads of Twilight.txt
44. Harry Potter 2 - Chamber of Secrets.txt
45. Auel, Jean - Earth's Children 01 - The Clan of the Cave Bear.txt
46. Harry Potter 6 - The Half Blood Prince.txt
47. Robert Jordan - The Wheel of Time 03 - The Dragon Reborn.txt
48. R.A. Salvatore - The Legacy of the Drow 1 - Legacy.txt
49. 01 Spellfire.txt
50. Frank Herbert - Children of Dune.txt
51. 01 Time Of The Twins.txt
52. R.A. Salvatore - The Legacy of the Drow III - Siege of Darkness.txt
53. Robert Jordan - The Wheel of Time 08 - The Path of Daggers.txt
54. R.A. Salvatore - The Icewind Dale Trilogy - 3 - The Halfling's Gem.txt
55. Auel, Jean - Earth's Children 05 - The Shelters Of Stone.txt
56. Harry Potter 7 - Deathly Hollows.txt
57. Robert Jordan - The Wheel of Time 07 - A Crown of Swords.txt
58. Harry Potter 1 - Sorcerer's Stone.txt
59. 05 Crucible - The Trial Of Cyric The Mad.txt
60. Star Wars - [Thrawn Trilogy 01] - Heir to the Empire (by Timothy Zahn).txt
61. Robert Jordan - The Wheel of Time 05 - The Fires of Heaven.txt
62. Robert Jordan - The Wheel of Time Compendium.txt
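The line-per-sentence layout described above can be reproduced with a naive sketch like the following (the sentence-splitting rule here is an assumption; the preprocessing actually used for this dataset is not documented, and the sample texts are purely illustrative):

```python
import re

def sentences_per_line(book_texts):
    """Concatenate raw book texts and yield one sentence per line.
    Uses a naive split on sentence-ending punctuation; real
    preprocessing (abbreviations, dialogue, ellipses) needs more care."""
    corpus = " ".join(t.strip() for t in book_texts)
    for sentence in re.split(r"(?<=[.!?])\s+", corpus):
        if sentence:
            yield sentence

# Hypothetical snippets standing in for the book files listed above.
books = ["The shard glowed. Drizzt drew his blades!", "Winter came early."]
print("\n".join(sentences_per_line(books)))
```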
|